Educational Leadership and Policy Studies


Evaluation Methodology Blog

The What, When, Why, and How of Formative Evaluation of Instruction

December 1, 2023 by jhall152

By M. Andrew Young

Hello! My name is M. Andrew Young. I am a second-year student in the Evaluation, Statistics, and Methodology Ph.D. program here at UT-Knoxville. I currently work in higher education assessment as the Director of Assessment at East Tennessee State University’s College of Pharmacy. As part of my duties, I am frequently called upon to conduct classroom assessments.

Higher education assessment often relies on summative evaluation of instruction, commonly known as course evaluations, summative assessment of instruction (SAI), or summative evaluation of instruction (SEI), administered at the end of a course. At my institution, the purpose of summative evaluation of instruction is primarily to evaluate faculty for tenure, promotion, and retention. What if there were a more student-centered approach to gathering classroom evaluation feedback, one that benefits not only students in future classes (as summative assessment does) but also students currently enrolled in the class? Enter formative evaluation of instruction (FEI).

 

What is FEI? 

FEI, sometimes referred to as midterm evaluation, entails seeking feedback from students before the semester midpoint in order to make mid-stream changes that address each cohort’s individual learning needs. Collecting meaningful, actionable FEI can be challenging. Faculty may prefer not to participate in formative evaluation because they do not find the student feedback actionable, or because they do not value student input. Furthermore, there is little direction on how to collect this feedback and how to use it for continuous quality improvement in the classroom. While there is a substantial literature on summative evaluation of teaching, there seems to be a dearth of research on best practices for formative evaluation of teaching. The few articles I have been able to find offer suggestions for FEI covered later in this post.

 

When Should We Use FEI? 

In my opinion, every classroom can benefit from formative evaluation. When to administer it is as much an art as a science: timing is everything, and the results can differ greatly depending on when the evaluation is administered. In my time as a Director of Assessment, I have found that the most meaningful feedback is gathered in the first half of the semester, directly after a major assessment, when students have a clearer sense of both their comprehension of the material and the effectiveness of the classroom instruction. There is very little literature to support this, so it is purely anecdotal. None of the resources I have found prescribe precisely when FEI should be conducted, but the name implies that feedback should be sought at or around the semester midpoint.

 

Why Should We Conduct FEI? 

FEI Can Help:

  • Improve student satisfaction on summative evaluation of instruction (Snooks et al., 2007; Veeck et al., 2016)
  • Make substantive changes to the classroom experience, including textbooks, examinations/assessments of learning, and instructional methods (Snooks et al., 2007; Taylor et al., 2020)
  • Strengthen teaching and improve rapport between students and faculty (Snooks et al., 2007; Taylor et al., 2020)
  • Support faculty development, including promotion and tenure (Taylor et al., 2020; Veeck et al., 2016), and encourage active learning (Taylor et al., 2020)
  • Bolster communication of expectations in a reciprocal relationship between instructor and student (Snooks et al., 2007; Taylor et al., 2020)

 

How Should We Administer the FEI? 

Research has suggested a wide variety of practices, including, but not limited to: involving a facilitator for the formative evaluation, asking open-ended questions, using no more than ten minutes of classroom time, keeping it anonymous, and keeping it short (Holt & Moore, 1992; Snooks et al., 2007; Taylor et al., 2020), and even having students work in groups or in student conferences to provide the feedback (Fluckiger et al., 2010; Veeck et al., 2016).

Hanover (2022) concluded that formative evaluation should include: a 7-point Likert-scale question evaluating how the course is going for the student, followed by an open-ended question asking the student to explain the rating; the “Keep, Stop, Start” model with open-ended response questions; and, finally, open-ended questions that allow students to suggest changes and provide additional feedback on the course and/or instructor. The “Keep, Stop, Start” model asks students what they would like the instructor to keep doing, stop doing, and/or start doing.

In the College of Pharmacy, we use the method Hanover presented: we ask students to self-evaluate how well they feel they are doing in the class and then explain their rating in an open-ended, free-response field. The purpose is to collect and analyze the themes associated with the different levels of evaluation rating (Best Practices in Designing Course Evaluations, 2022). This has been in practice at the college for only the past academic year, but anecdotally, from conversations with faculty, the data collected have generally been more actionable. Like all evaluation, it is not a perfect system, and some of the data are not actionable, but in our college FEI is an integral part of indirect classroom assessment.

The most important step, however, is to close the feedback loop in a timely manner (Fluckiger et al., 2010; Taylor et al., 2020; Veeck et al., 2016). For our purposes, closing the feedback loop means asking the course coordinator to respond to the feedback given in the FEI, usually within a week, detailing what changes, if any, will be made in the classroom and learning environment.
Obviously, not all feedback is actionable, and in some cases students’ suggestions conflict with best practices in the literature, but it is important for students to know what can be changed, what cannot or will not be changed, and why.
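To make the design described above concrete, here is a minimal sketch, in Python, of how responses from this kind of instrument might be tallied: a self-rating on a 7-point scale plus “Keep, Stop, Start” comments, with comments grouped by rating band so themes can be read against how students say the course is going. The response data, field names, and rating bands are all hypothetical, invented for illustration; this is not the instrument we use in the College of Pharmacy.

```python
from collections import Counter, defaultdict

# Hypothetical FEI responses: a 7-point self-rating plus
# "Keep, Stop, Start" open-ended comments.
responses = [
    {"rating": 6, "keep": "worked examples", "stop": "", "start": "practice quizzes"},
    {"rating": 3, "keep": "case studies", "stop": "reading slides verbatim", "start": ""},
    {"rating": 5, "keep": "worked examples", "stop": "", "start": "exam reviews"},
    {"rating": 2, "keep": "", "stop": "reading slides verbatim", "start": "practice quizzes"},
]

# Distribution of self-ratings on the 7-point scale.
rating_counts = Counter(r["rating"] for r in responses)

# Group non-empty comments by rating band so themes can be examined
# against the different levels of evaluation rating.
comments_by_band = defaultdict(list)
for r in responses:
    band = "low (1-3)" if r["rating"] <= 3 else "high (4-7)"
    for field in ("keep", "stop", "start"):
        if r[field]:
            comments_by_band[band].append((field, r[field]))

print(rating_counts)
print(dict(comments_by_band))
```

In practice the free-response comments would still need qualitative thematic coding; a tally like this only organizes them so patterns (such as a “stop” item recurring among low self-raters) are easier to spot before the feedback loop is closed.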

 

What Remains? 

Some accrediting bodies, like the Accreditation Council for Pharmacy Education (ACPE), require colleges to have an avenue for formative student feedback as part of their standards. I believe that formative evaluation benefits students and faculty alike, and while it may be too early to make a sweeping change and require FEI at every higher education institution, there is value in educating faculty and assessment professionals about the benefits of FEI. Although outside the scope of this short blog post, adopting FEI as a common practice should be approached carefully, intentionally, and with best practices for organizational change management.

Some final thoughts: to get students engaged in providing good feedback, the practice of FEI ideally has to be championed by the faculty. It could be mandated by administration, but that would likely not engender as much buy-in, and if the faculty, who are the primary touchpoints for students, aren’t sold on the practice or participate begrudgingly, the data collected will not be optimal or actionable. Students talk with each other across cohorts, and if students in upper classes have a negative opinion of the process, it will trickle down. What is the best way to make students disengage? Don’t close the feedback loop.

 

References and Resources 

Best Practices in Designing Course Evaluations. (2022). Hanover Research. 

Fluckiger, J., Tixier, Y., Pasco, R., & Danielson, K. (2010). Formative Feedback: Involving Students as Partners in Assessment to Enhance Learning. College Teaching, 58, 136–140. https://doi.org/10.1080/87567555.2010.484031 

Holt, M. E., & Moore, A. B. (1992). Checking Halfway: The Value of Midterm Course Evaluation. Evaluation Practice, 13(1), 47–50. 

Snooks, M. K., Neeley, S. E., & Revere, L. (2007). Midterm Student Feedback: Results of a Pilot Study. Journal on Excellence in College Teaching, 18(3), 55–73. 

Taylor, R. L., Knorr, K., Ogrodnik, M., & Sinclair, P. (2020). Seven principles for good practice in midterm student feedback. International Journal for Academic Development, 25(4), 350–362. 

Veeck, A., O’Reilly, K., MacMillan, A., & Yu, H. (2016). The Use of Collaborative Midterm Student Evaluations to Provide Actionable Results. Journal of Marketing Education, 38(3), 157–169. https://doi.org/10.1177/0273475315619652 

 

Filed Under: Evaluation Methodology Blog

Evaluation in the Age of Emerging Technologies

November 15, 2023 by jhall152

By Richard Amoako

Greetings! My name is Richard Dickson Amoako. I am a second-year Ph.D. student in Evaluation, Statistics, and Methodology at the University of Tennessee, Knoxville. My research interests include program evaluation, impact evaluation, higher education assessment, and emerging technologies in evaluation.

As a lover of technology and technological innovation, I am intrigued by technological advancements in all spheres of our lives. The most recent of these is the rapid development and improvement of artificial intelligence (AI) and machine learning (ML). As an emerging evaluator, I am interested in learning about the implications of these technologies for evaluation practice.

Throughout this blog post, I explore the implications of these technologies for evaluation, including the technologies most relevant to evaluation, how they can change the conduct of evaluation, the benefits and opportunities they offer evaluators, and the challenges and issues associated with their use.

 

Relevant Emerging Technologies for Evaluation 

Emerging technologies are new and innovative tools, techniques, and platforms that can transform the evaluation profession. They can broadly be categorized into four groups: data collection and management tools, data visualization and reporting tools, data analysis and modeling tools, and digital and mobile tools. Three of the most popular emerging technologies relevant to evaluation are artificial intelligence, machine learning, and big data analytics.

These technologies can change how evaluation is conducted and offer several benefits and opportunities for evaluators:

  • Data collection and analysis: AI and ML can help evaluators analyze data faster and more accurately, and they can surface patterns and trends that may not be apparent to the naked eye. Emerging technologies have also enabled new data collection methods, such as crowdsourcing, social media monitoring, and web analytics, which give evaluators access to a wider range of data sources and more comprehensive, diverse data.
  • Increased access to data: Social media, mobile devices, and other technologies have made it easier to collect data from a wider range of sources. This can help evaluators gather more diverse perspectives and ideas. 
  • Improved collaboration: Evaluators can collaborate more effectively with the help of video conferencing, online collaboration platforms, and project management software, regardless of where they are located. 
  • Improved visualization: Evaluators can present their findings in a more engaging and understandable way by using emerging technologies like data visualization software and virtual reality. 
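As one small illustration of machine-assisted pattern finding in open-ended feedback, the sketch below counts content words across a handful of survey responses using only the Python standard library. Everything here (the responses, the stopword list, the function name) is invented for illustration; real projects would use far more sophisticated natural language processing, but even a crude frequency count can surface candidate themes for an evaluator to then examine qualitatively.

```python
import re
from collections import Counter

# Hypothetical open-ended responses gathered via an online survey tool.
responses = [
    "The program staff were responsive and the workshops were useful.",
    "Workshops felt rushed; more time for questions would help.",
    "Responsive staff, but the online platform was confusing.",
]

# A tiny, illustrative stopword list; real analyses use curated lists.
STOPWORDS = {"the", "and", "were", "was", "for", "more", "but", "a"}

def term_frequencies(texts):
    """Count content words across responses: a first, crude pass at
    surfacing patterns an evaluator might then examine qualitatively."""
    words = []
    for text in texts:
        words.extend(w for w in re.findall(r"[a-z]+", text.lower())
                     if w not in STOPWORDS)
    return Counter(words)

freq = term_frequencies(responses)
print(freq.most_common(3))
```

Here recurring terms like “staff,” “responsive,” and “workshops” would flag themes worth a closer human read, which is the point: the technology accelerates pattern detection, while interpretation remains the evaluator’s job.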

 

Challenges and Issues Associated with Emerging Technologies in Evaluation 

While emerging technologies offer many exciting opportunities for evaluators, they also come with challenges. One of the main challenges is keeping up to date with the latest technologies and trends. Evaluators should have a solid understanding of the technologies they use, as well as the limitations and potential biases associated with those technologies. In some cases, emerging technologies can be expensive or require specialized equipment, which can be a barrier for evaluators with limited resources. 

Another challenge is the need to ensure emerging technologies are used ethically and responsibly. As the use of emerging technologies in evaluation becomes more widespread, there is a risk that evaluators may inadvertently compromise the privacy and security of program participants. In addition, they may inadvertently misuse data. To address these challenges, our profession needs to develop clear guidelines and best practices for using these technologies in evaluation. 

To conclude, emerging technologies are revolutionizing the evaluation landscape, opening new opportunities for evaluators to collect, analyze, and use data. With artificial intelligence and machine learning, as well as real-time monitoring and feedback, emerging technologies are changing evaluation and increasing the potential for action-based research. However, as with any advancing technology, there are also challenges to resolve. Evaluators must keep up to date with the latest technologies and develop clear guidelines and best practices. They must also ensure that these technologies are used ethically and responsibly. 

 

Resources 

Adlakha D. (2017). Quantifying the modern city: Emerging technologies and big data for active living research. Frontiers in Public Health, 5, 105. https://doi.org/10.3389/fpubh.2017.00105 

Borgo, R., Micallef, L., Bach, B., McGee, F., & Lee, B. (2018). Information visualization evaluation using crowdsourcing. STAR – State of The Art Report, 37(7). Available at: https://www.microsoft.com/en-us/research/uploads/prod/2018/05/InfoVis-Crowdsourcing-CGF2018.pdf 

Dimitriadou, E., & Lanitis, A. (2023). Critical evaluation, challenges, and future perspectives of using artificial intelligence and emerging technologies in smart classrooms. Smart Learning Environments, 10, 12. https://doi.org/10.1186/s40561-023-00231-3 

Huda, M., Maseleno, A., Atmotiyoso, P., Siregar, M., Ahmad, R., Jasmi, K. A., & Muhamad, N. H. N. (2018). Big data emerging technology: Insights into innovative environment for online learning Resources. International Journal of Emerging Technologies in Learning (iJET), 13(01), pp. 23–36. https://doi.org/10.3991/ijet.v13i01.6990 

Jurafsky, D., & Martin, J. H. (2009). Speech and Language Processing: An Introduction to Natural Language Processing, Computational Linguistics, and Speech Recognition. Prentice Hall. 

World Health Organization. (2016). Monitoring and evaluating digital health interventions: A practical guide to conducting research and assessment. WHO Press. Available at:  https://saluddigital.com/wp-content/uploads/2019/06/WHO.-Monitoring-and-Evaluating-Digital-Health-Interventions.pdf 


So, You Want to Be a Higher Education Assessment Professional? What Skills and Dispositions are Essential?

November 1, 2023 by jhall152

By Jennifer Ann Morrow, PhD.

What does it take to be a competent higher education assessment professional? What skills and dispositions are needed to be successful in this field? I would get asked this question a lot by my students, and while my go-to answer was often “it depends,” that answer would not suffice in preparing them for this career path. So, to give them a more comprehensive answer, I went to the literature.

Although I have been teaching emerging assessment and evaluation professionals for the past 22 years, at various times coordinating both a Ph.D. program and a certificate program in Evaluation, Statistics, and Methodology, I didn’t want to rely only on what our curriculum focuses on to answer their question. We educate students with diverse career paths (e.g., assessment professional, evaluator, faculty, data analyst, psychometrician), so our curriculum touches on skills and dispositions across a variety of careers. Therefore, I delved deeper into the literature to give my students a more focused answer for their chosen career path.

Guess what I found… “it depends!” There was little to no consistency or agreement within our field about which competencies are essential for a higher education assessment professional. Depending on whom you asked and what source you read, the answer was different. While some sources touched on needed knowledge and skills, very few discussed the dispositions essential to our professional practice. My curious mind was racing, and after some long discussions and literature review with one of my fabulous graduate students, Nikki Christen, we started compiling lists of needed skills and dispositions from the literature. We soon realized that we needed to hear from higher education assessment professionals themselves to figure out what skills and dispositions were needed. So, a new research project was born! We brought on two other fabulous assessment colleagues, Dr. Gina Polychronopoulos and Dr. Emilie Clucas Leaderman, and developed a national survey to assess higher education assessment professionals’ perceptions of the skills and dispositions needed to be effective in their jobs. I wanted to be able to give my students a better answer than “it depends!”

You can check out our article (https://www.rpajournal.com/dev/wp-content/uploads/2022/03/A-Snapshot-of-Needed-Skills-RPA.pdf) for detailed information on our methodology and results for this project. We had 213 higher education assessment professionals from across the country rate the importance of 92 skills and 52 dispositions for our field. I’ll briefly summarize the results here and then offer my suggestions to those who are interested in this career path. 

 

Summary of Needed Skills 

We found that the most important skills were interpersonal ones! Collaborating with others on assessment, developing collaborative relationships with stakeholders, and working with faculty on assessment projects were the highest-rated skills. One participant even stated, “assessment is about people!” Building relationships, collaboration, facilitation, and communication were all salient themes here. Other highly rated skills related to disseminating information: communicating assessment results to stakeholders, communicating assessment results in writing, and disseminating assessment results were all highly rated by higher education assessment professionals. Leadership skills were also deemed highly important. Advocating for the value of assessment, developing a culture of assessment within an organization, and facilitating change in an organization using assessment data were all seen as key skills. Project management was also rated as highly important: managing time, managing projects, and managing people were highly valued skills. Various aspects of assessment design, developing assessment tools, data management, and engaging in ethical assessment were also highly rated. One unexpected finding was that a number of assessment professionals mentioned teaching experience as a needed skill in the open-ended responses (ha, the educator forgot to ask about teaching!).

 

Summary of Needed Dispositions 

Many dispositions were rated as highly important by our participants. One mentioned, “personally I feel dispositions are more vital than technical skills. You can learn the techniques but without the personality, you will have trouble motivating others!” Interpersonal dispositions such as collaborative, honest, helpful, inclusive, and supportive were deemed highly important, and responsiveness was also highly rated. Dispositions like problem solver and adaptable were found to be highly important. A consistent work approach mattered as well: dispositions such as trustworthy, reliable, ethical, analytical, detail-oriented, and strategic were highly rated in this category. Expression-related dispositions were also seen as important; being transparent, articulate, and professional were all highly rated. Other themes that emerged from the open-ended responses were flexibility, patience, “thick skin,” and “it depends” (seriously, I didn’t even prompt them for that response!).

 

Next Steps: Starting Your Journey as a Higher Education Assessment Professional 

So now what? Now that you have some idea of the skills and dispositions needed to be successful as a higher education assessment professional, what are your next steps? My advice is threefold: read, engage, and collaborate. Read the latest articles in the leading assessment journals (see the list below); there you will find the latest trends, the leading scholars, and suggestions for all the unanswered questions that still need to be explored in our field. Engage in learning and networking opportunities: attend the many conferences, webinars, and trainings (some are free!), and join a professional organization and get involved. The Association for the Assessment of Learning in Higher Education (AALHE) is one of my homes; they have always been welcoming, and I’ve made great connections by attending events and volunteering. Collaborate by reaching out to others in our field for advice, to discuss research interests, and to explore possible collaborations. Post a message on the ASSESS listserv asking for advice or to connect with others who have similar research interests. There are many ways to learn more about our field and to get involved…just put yourself out there. Good luck on your journey!

 

References and Resources 

Christen, N., Morrow, J. A., Polychronopoulos, G. B., & Leaderman, E. C. (2023). What should be in an assessment professionals’ toolkit? Perceptions of need from the field. Intersection: A Journal at the Intersection of Assessment and Learning. https://aalhe.scholasticahq.com/article/57789-what-should-be-in-an-assessment-professionals-toolkit-perceptions-of-need-from-the-field/attachment/123962.pdf 

Gregory, D., & Eckert, E. (2014, June). Assessment essentials: Engaging a new audience (things student affairs personnel should know or learn). Paper presented at the annual Student Affairs Assessment and Research Conference, Columbus, OH. 

Hoffman, J. (2015). Perceptions of assessment competency among new student affairs professionals. Research & Practice in Assessment, 10, 46-62. https://www.rpajournal.com/dev/wp-content/uploads/2015/12/A4.pdf 

Horst, S. J., & Prendergast, C. O. (2020). The Assessment Skills Framework: A taxonomy of assessment knowledge, skills and attitudes. Research & Practice in Assessment, 15(1). https://www.rpajournal.com/dev/wp-content/uploads/2020/05/The-Assessment-Skills-Framework-RPA.pdf 

Janke, K. K., Kelley, K. A., Sweet, B. V., & Kuba, S. E. (2017). Cultivating an assessment head coach: Competencies for the assessment professional. Assessment Update, 29(6). doi:10.1002/au.30113 

Polychronopoulos, G. B., & Clucas Leaderman, E. (2019). Strengths-based assessment practice: Constructing our professional identities through reflection. NILOA Viewpoints. Retrieved from https://www.learningoutcomesassessment.org/wp-content/uploads/2019/08/Viewpoints-Polychronopoulos-Leaderman.pdf 

AALHE: https://www.aalhe.org/  

AEFIS Academy: https://www.aefisacademy.org/global-category/assessment/?global_filter=all  

Assess Listserv: https://www.aalhe.org/assess-listserv 

Assessment and Evaluation in Higher Education: https://www.tandfonline.com/journals/caeh20 

Assessment in Education: Principles, Policy & Practice: https://www.tandfonline.com/loi/caie20 

Assessment Institute: https://assessmentinstitute.iupui.edu/ 

Assessment Update: https://onlinelibrary.wiley.com/journal/15360725 

Educational Assessment, Evaluation, and Accountability: https://www.springer.com/journal/11092 

Emerging Dialogues: https://www.aalhe.org/emerging-dialogues 

Intersection: A Journal at the Intersection of Assessment and Learning: https://www.aalhe.org/intersection 

JMU Higher Education Assessment Specialist Graduate Certificate: https://www.jmu.edu/pce/programs/all/assessment/index.shtml 

Journal of Assessment and Institutional Effectiveness: https://www.psupress.org/journals/jnls_jaie.html 

Journal of Assessment in Higher Education: https://journals.flvc.org/assessment 

Online Free Assessment Course: http://studentaffairsassessment.org/online-open-course 

Practical Assessment, Research, and Evaluation: https://scholarworks.umass.edu/pare/ 

Research & Practice in Assessment: https://www.rpajournal.com/ 

Rider University Higher Education Assessment Certificate: https://www.rider.edu/academics/colleges-schools/college-education-human-services/certificates-endorsements/higher-education-assessment 

Ten Trends in Higher Education Assessment: https://weaveeducation.com/assessment-meta-trends-higher-ed/ 

Weave Assessment Resources: https://weaveeducation.com/assessment-accreditation-webinars-ebooks-guides/?topic=assessment 

 


So I Like Statistics, Now What?

October 15, 2023 by jhall152

By Jake Working 

Whether you’ve taken a statistics class, recently read a report from your data analyst, or simply want to make data-driven decisions, something about statistics just clicked with you. But what comes next? What can you do with this newfound passion?

I’m Jake Working, a current Ph.D. student in the Evaluation, Statistics, and Methodology program at the University of Tennessee, and I had similar questions after my first statistics class in college. In this post, I’ll discuss methods and rationale for improving your statistical skill set and give an introduction to the methodology, evaluation, statistics, and assessment (MESA) fields.

Overview

1. Explore Statistics: Methods to improve your statistical skill set

2. Discover Your Motivation: Refine your rationale for statistical application

3. What is MESA? An introduction to the fields

 

Explore Statistics

Now that you have found an interest, keep learning! If you are still in college, consider a statistics minor or simply taking a few courses outside your major. As an engineering student, I was able to take additional statistics-related courses, such as business statistics and statistical methods in Six Sigma. Most institutions offer topical, statistics-based courses like these, but it is also important to take foundational statistics courses taught in a mathematical setting to build a basic understanding of statistical theory and methodology.

Image Credit: XKCD

Creating a foundational knowledge of statistics does not have to be expensive, though. If you aren’t currently a college student, there are endless opportunities to gain statistical knowledge for free! A popular statistical analysis program, R, is free and open source. I recommend an interface such as RStudio or BlueSky (both also free and open source) to use with R, and a certification course to get started (such as this one offered by Johns Hopkins). In the manufacturing industry, statistical analysis related to Six Sigma or quality control may be more beneficial, and there are many options to become Six Sigma certified.

 

Discover Your Motivation

Why did you initially enjoy statistics? I was drawn to multiple aspects of statistical analysis, such as data visualization and data-driven decision making, which ultimately led me to the MESA field.

At first, I was motivated by statistical reporting and data visualization techniques that allow complex but useful information to be distilled into something digestible and easy to understand. While it may come naturally to some, data visualization is a learned and ever-changing process. If you are interested in this area, I recommend checking out Stephanie Evergreen’s Evergreen Data for data visualization checklists, best practices, and online courses!

Most importantly, I enjoyed being able to support any decision I made with data. This motivation allowed me to weave statistical methods for data-driven decision making into any role I held. Data-driven decision making is popular in every field because it gives you substantial rationale and evidence to create progress. If you are like me, you enjoy the field you work in and want to apply these motivations there formally. Enter the MESA fields.

 

What is MESA?

The interwoven fields of methodology, evaluation, statistics, and assessment (MESA) include a growing number of career opportunities for those who started with an initial passion for statistics. While you likely understand statistics, how do the other fields connect?

Methodology, in this application, refers to the systems (or methods) of gathering information related to a particular problem (Charles, 2019). It is how you gather information to address your question or problem. Examples of methodologies include qualitative, quantitative, and mixed methods. The Grad Coach has a great resource on defining research methodology. You can think of statistics and methodology as the tools used to conduct assessments and evaluations, the other areas of MESA.

Evaluation refers to the process of determining the merit, worth, or value of a process or the product of that process (Scriven, 1991, p. 139). One common area within this field is program evaluation, which focuses on the evaluation of program objectives and will lead to decisions regarding the program.

Assessment is often defined as “any effort to gather, analyze, and interpret evidence which describes institutional, divisional, or agency effectiveness” (Upcraft & Schuh, 1996, p. 18). The main goal of assessment is to gather information in order to improve performance. Examples of assessment include standardized tests, surveys, homework or exams, or self-reflection (Formative, 2021).

If you’d like to understand what types of careers lie within these fields, search for jobs related to evaluation, assessment, methodologist, data analyst, psychometrics, or research analyst.

 

References

Charles, H. (2019). Research Methodology Definition [PowerPoint slides]. SlidePlayer. https://slideplayer.com/slide/13250398/

Formative and Summative Assessments. Yale Poorvu Center for Teaching and Learning. (2021, June 30). Retrieved March 26, 2023, from https://poorvucenter.yale.edu/Formative-Summative-Assessments

Scriven, M. (1991). Evaluation Thesaurus. Sage. https://files.eric.ed.gov/fulltext/ED214952.pdf

Upcraft, M. L., & Schuh, J. H. (1996). Assessment in Student Affairs: A Guide for Practitioners. The Jossey-Bass Higher and Adult Education Series. Jossey-Bass Inc., Publishers, 350 Sansome St., San Francisco, CA 94104.

Filed Under: Evaluation Methodology Blog

Introducing the Blog!

October 5, 2023 by jhall152

From the Faculty of the Evaluation, Statistics, & Methodology PhD Program!

Hello and welcome to our new blog. We are MAD about Methods! As faculty who have been involved in the field of Measurement, Evaluation, Statistics, and Assessment (MESA) for many years, we are excited to introduce you to our new blog: MAD (Meaningful, Action-Driven) with Methods and Measures. Our blog is sponsored by the Evaluation, Statistics, and Methodology program at The University of Tennessee, Knoxville, and our aim is to engage in discussions and enrich scholarly contributions about MESA.

MESA is an interdisciplinary field that involves the application of rigorous quantitative and qualitative methodologies to assess problems in the educational, social, and behavioral sciences. At the core of MESA is the idea of gathering and analyzing data to help policy makers, practitioners, and researchers make informed decisions. The field encompasses a wide range of topics, including educational assessment, program evaluation, psychometrics, survey research, qualitative methods, and data science. Through our blog, we hope to provide a platform for scholars and practitioners to share their insights and experiences, and to discuss the latest developments in the field.

Our vision for this blog is to become the go-to place for anyone interested in MESA topics or looking to stay informed about the latest happenings and hot topics in our field. Whether you are a student just starting your journey or an experienced practitioner looking to stay up-to-date with the latest research, we hope that you will find our blog to be a valuable resource.

In addition to providing brief research topics and news, we also hope to use this blog as an opportunity to explore the skills, knowledge, and dispositions required to be successful in the MESA field. We will be highlighting the work of scholars and practitioners who are making a difference in the field and discussing the competencies that have enabled them to achieve success. Each blog post will also contain a list of resources for readers who are interested in exploring the topic in greater detail.

The Evaluation, Statistics, and Methodology (ESM) program at the University of Tennessee is committed to providing students with the skills and knowledge they need to succeed in the MESA field. We offer two graduate programs: a residential PhD in Evaluation, Statistics, and Methodology, and a fully online MS in Education with a concentration in Evaluation Methodology. Through our blog, we hope to provide emerging and experienced professionals with the insights and guidance they need to excel in their chosen discipline.

We hope you find our MAD blog a valuable platform for coming together to discuss the latest developments in evaluation, assessment, and research methodology, and we invite you to join us on this journey and go MAD with methods!

On behalf of the ESM Faculty:

Louis Rocconi, Jennifer Morrow, Leia Cain, Fatima Zahra

