Arizona State University DARS student guide
Arizona State University’s Degree Audit Reporting System (DARS) is a tool students use to track their progress toward graduation. The system is complex, and advisors spend significant time explaining it to students. My team and I created an interactive video guide that assesses students’ knowledge, explains how to interpret the audit, and asks a series of questions to provide custom tips based on each student’s primary academic goal and student type. After exploring the guide, students were more prepared to talk with their advisor.
my roles & process
user recruitment + discovery
I created a screener survey and recruited six students (freshmen, seniors, and transfer students) for 1:1 phone interviews. During the 20-minute discussions, I followed a script guide to understand their DARS experience: their frustrations, what they appreciated, and how they interpreted the information.
This helped me understand where students needed the most help and which content to highlight to improve their experience and understanding.
internal ideation + design
We discussed the findings internally and agreed we needed to create an experience that felt customized to the student’s needs based on their academic status, while highlighting important information in an interactive and digestible way.
After a group whiteboarding session, I created low-fidelity wireframes in Lucidchart to get a feel for the flow and content needs.
content creation
In addition to the feedback I gathered from students and stakeholders, I did extensive research on the DARS and Major Map tools and their existing documentation to develop a content grid. I wrote ten important things to know, as well as video talking points and advisor tips, for each academic segment. This was presented to students as a personalized “syllabus for success” within the experience.
A student’s degree audit can look very different depending on their current standing. It’s also critical that the audit is interpreted correctly, as misinterpretation could impact the student’s academic goals and/or graduation. I wanted to provide a solid foundation of knowledge, along with conversation tips, to set students up for a productive meeting with their advisor.
usability testing
After the pilot experience was built, I created a recruitment and testing plan to determine whether students could navigate the experience intuitively, were satisfied with the level of information provided, and left feeling ready to have a meaningful conversation with their advisor. I was granted access to ASU’s engineering student Slack channel and began scheduling 1:1 interviews. Over a one-week period, I conducted four 30-minute remote sessions via Zoom. I analyzed the findings and created a formal report to share with clients, including methodology and recommendations for improvement.
findings
Overall, students were very satisfied with the experience and agreed it would be helpful for future students to have this information in one place so they could have a more productive conversation with their advisor.
Two major findings emerged: 100% of users missed the second video, and 75% of students had trouble identifying with only one primary academic goal. I created an A/B test to help increase engagement with the second video and recommended the experience be updated to allow users to go back and choose a different academic goal to view more tips.
Additionally, I used Inspectlet to analyze mobile and desktop session recordings, rage clicks, and engagement heat maps. Google Analytics was used to monitor drop-off rates, primary referral sources, time spent, and page views.