PLUS

Learning Experience Researcher
May 2022 – present

PLUS (Personalized Learning Squared) is software designed to enhance learning effectiveness for K-12 students, especially marginalized students, by "squaring" the power of human and computer tutoring. Human tutors receive research-driven training and student learning data to support both the academic and the social-emotional growth of students. As a Learning Experience Researcher, I evaluate the training lessons that my colleagues design for tutors and propose suggestions for instructional iteration. Another focus is exploring new possibilities, such as machine learning techniques, for further personalizing the learning experience.
View the PLUS website: https://tutors.plus/

Overview
Internship & full-time position supervised by Prof. Ken Koedinger
[Image from https://tutors.plus/]
Learning Evaluation of Training Lessons

Based on previous research, we established the SMART framework, which represents the competencies necessary for effective tutoring [3]. Under each topic, PLUS offers training lessons that help tutors build these skills [2].
The scenario-based lessons [1, 4] are short (under 15 minutes) and self-paced, focusing on socio-motivational support for students. Following the "learn by doing" approach [6, 8, 10], each lesson guides the learner through a predict-explain-observe-explain structure: the learner first responds to a student encountering difficulty in an imaginary scenario, then reads research-driven recommendations and explains the reasoning behind their observations. Finally, the same process repeats with a second scenario to assess knowledge transfer.
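The lesson flow above can be sketched as a small data model. This is purely illustrative: the class and field names are my own invention, not part of the PLUS codebase.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Scenario:
    """One imaginary tutoring situation inside a lesson."""
    description: str
    # Predict: the learner first writes an open-ended response.
    constructed_prompt: str
    # Observe: research-driven recommendations shown to the learner.
    recommendations: List[str]
    # Explain: the learner explains the reasoning behind their observations.
    explain_prompt: str

@dataclass
class Lesson:
    """A short, self-paced scenario-based lesson (under 15 minutes)."""
    title: str
    # The first scenario teaches; the second checks knowledge transfer.
    scenarios: List[Scenario] = field(default_factory=list)

lesson = Lesson(
    title="Giving Effective Praise",
    scenarios=[
        Scenario(
            description="A student solves a hard problem after several attempts.",
            constructed_prompt="How would you respond to this student?",
            recommendations=["Praise effort and strategy, not innate ability."],
            explain_prompt="Why is this response effective?",
        ),
    ],
)
```

In this sketch, a complete lesson would hold two `Scenario` objects, the second serving as the transfer check.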
Each scenario includes a constructed response and a selected response; the options in the latter are learner-sourced [7] from past constructed responses or from real chat logs between tutors and students. Learner-sourcing better captures learners' misconceptions and increases the authenticity of the scenarios.


In the work reported in our article submitted to the 13th International Learning Analytics and Knowledge Conference, we released three pilot lessons, "Responding to Students' Errors," "Learning What Students Know," and "Giving Effective Praise," and recruited tutors from a nonprofit online tutoring organization as the learners. I contributed to the coding of their constructed responses, and after combining the selected-response and constructed-response scores across both scenarios in each lesson, we observed a significant learning gain.
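The score aggregation can be illustrated with a minimal sketch. The numbers and function name below are hypothetical, not our actual study data; each score is assumed to already combine the selected-response and constructed-response items for one scenario.

```python
def learning_gain(scenario1_scores, scenario2_scores):
    """Mean per-learner score difference between the second (transfer)
    scenario and the first scenario of a lesson."""
    gains = [post - pre for pre, post in zip(scenario1_scores, scenario2_scores)]
    return sum(gains) / len(gains)

# Illustrative combined scores (0-1 scale) for five hypothetical learners.
pre = [0.40, 0.55, 0.50, 0.35, 0.60]
post = [0.70, 0.80, 0.65, 0.60, 0.75]
print(round(learning_gain(pre, post), 2))  # mean gain across learners
```

A positive mean gain, tested for significance across learners, is what "significant learning gain" refers to here.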
However, we also saw room for improvement in the instructions and assessments. Generally speaking, the selected responses were too easy for the learners, and some did not cover all the common misconceptions shown in the constructed responses. Design is always an iterative process, and the evaluation results let us ground the next iteration in evidence. We also considered enhancing the learning experience by giving learners more feedback [11]. Given that manually scoring constructed responses and generating targeted feedback are labor-intensive, we are now exploring automated grading tools based on natural language processing techniques [5, 9].
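As a rough illustration of the direction (not the actual PLUS grading pipeline, and far simpler than the models in [5, 9]), even a bag-of-words similarity scorer can flag constructed responses that resemble rubric reference answers. All names and thresholds here are assumptions for the sketch.

```python
import math
import re
from collections import Counter

def tokenize(text):
    """Lowercase and split into word tokens."""
    return re.findall(r"[a-z']+", text.lower())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def auto_score(response, reference_answers, threshold=0.5):
    """Mark a constructed response correct if it is close enough
    to any rubric reference answer."""
    resp_vec = Counter(tokenize(response))
    best = max(cosine(resp_vec, Counter(tokenize(ref)))
               for ref in reference_answers)
    return best >= threshold

rubric = ["praise the student's effort and strategy"]
print(auto_score("I would praise the effort and strategy the student used", rubric))
```

In practice, modern short-answer scoring uses trained models rather than surface similarity, and human-in-the-loop review [5] guards against misclassification; this sketch only shows the input/output shape of such a tool.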

Publications

When the Tutor Becomes the Student: Design and Evaluation of Efficient Scenario-based Lessons for Tutors
Status: Submitted to the 13th International Learning Analytics and Knowledge Conference
Main contribution: As the second author, I reviewed the literature on related work (i.e., scenario-based learning), helped establish the coding schema, coded the constructed responses from the pilot lessons, shared the writing of the method section, and wrote the future-work portion of the discussion.

Response to Today's Tutoring Challenges: A Human and AI-Driven Support for Tutors
Status: Accepted by the International Journal of Advanced Corporate Learning
Main contribution: As the co-first author, I summarized the Personalized Learning Squared solutions, including tutor training, the student version, and tutor recruitment, in response to today's shortage of tutors and the changes in education brought by the pandemic.

A Meta-Analytic Investigation of the Impact of Middle School STEM Education: Where Are All the Students of Color?
Status: Submitted to the International Journal of STEM Education
Main contribution: As the third author, I reviewed the literature on practical recommendations for increasing the engagement and motivation of marginalized students in STEM education.

Automated Short Answer Scoring of Scenario-based Lessons for Tutors
Status: In progress
Main contribution: Building on the learning evaluation of the pilot lessons, we want to explore the application of automated short answer scoring and, ideally, implement automatic feedback to enhance the learning effectiveness of the training lessons.
References

1. Alattar, R. (2019). The effectiveness of using scenario-based learning strategy in developing EFL eleventh graders' speaking and prospective thinking skills. The Islamic University of Gaza, Palestine.
2. Chine, D. R., Chhabra, P., Adeniran, A., Gupta, S., & Koedinger, K. R. (2022, June). Development of Scenario-based Mentor Lessons: An Iterative Design Process for Training at Scale. In Proceedings of the Ninth ACM Conference on Learning@ Scale (pp. 469-471).
3. Chhabra, P., Chine, D. R., Adeniran, A., Gupta, S., & Koedinger, K. R. (2022, June). An Evaluation of Perceptions Regarding Mentor Competencies for Technology-based Personalized Learning. In E. Langran (Ed.), Proceedings of Society for Information Technology & Teacher Education International Conference (pp. 1812-1817). San Diego, CA: Association for the Advancement of Computing in Education (AACE).
4. Clark, R. (2009). Accelerating expertise with scenario-based learning. Learning Blueprint. Merrifield, VA: American Society for Teaching and Development, 10.
5. Funayama, H., Sato, T., Matsubayashi, Y., Mizumoto, T., Suzuki, J., & Inui, K. (2022). Balancing Cost and Quality: An Exploration of Human-in-the-Loop Frameworks for Automated Short Answer Scoring. In International Conference on Artificial Intelligence in Education (pp. 465-476). Springer, Cham.
6. Gibbs, G. (1988). Learning by doing: A guide to teaching and learning methods. Further Education Unit.
7. Kim, J. (2015). Learnersourcing: improving learning with collective learner activity (Doctoral dissertation, Massachusetts Institute of Technology).
8. Koedinger, K. R., Kim, J., Jia, J. Z., McLaughlin, E. A., & Bier, N. L. (2015, March). Learning is not a spectator sport: Doing is better than watching for learning from a MOOC. In Proceedings of the second (2015) ACM conference on learning@ scale (pp. 111-120).
9. Mello, R. F., Neto, R., Fiorentino, G., Alves, G., Arêdes, V., Silva, J. V. G. F., ... & Gašević, D. (2022). Enhancing Instructors’ Capability to Assess Open-Response Using Natural Language Processing and Learning Analytics. In European Conference on Technology Enhanced Learning (pp. 102-115). Springer, Cham.
10. Schank, R. C., Berman, T. R., & Macpherson, K. A. (1999). Learning by doing. Instructional-design theories and models: A new paradigm of instructional theory, 2(2), 161-181.
11. Torres, D., Pulukuri, S., & Abrams, B. (2022). Embedded Questions and Targeted Feedback Transform Passive Educational Videos into Effective Active Learning Tools. Journal of Chemical Education, 99(7), 2738-2742.
