ResEdChat Ep 55: Natasha Monteith on Effective Student Affairs Assessment and Evaluation

Dustin welcomes Natasha, one of our faithful bloggers, to the show this week. She shares her perspective on how student affairs divisions need to approach assessment to better manage their teams and support their students by leveraging data-informed practices.

Guests:

  • Natasha Monteith, PhD Student at UNC – Greensboro

Listen to the Podcast:

Watch the Video:

Read the Transcript:

Dustin Ramsdell:
Welcome back to Roompact’s ResEdChat Podcast, I’m your host, Dustin Ramsdell. This podcast series features a variety of topics of interest to higher ed professionals who work in, with, and around college and university housing. So, this is another episode in our ongoing series this season, talking to the current blogging team for Roompact. They cycle through a lot of great people, bringing in perspectives on a variety of topics. So, as we did, I think, right about this time last year, we are talking with all the bloggers. We will start out as we always do: Natasha, if you want to introduce yourself and your professional background, then we will go from there.

Natasha Monteith:
Yeah. Well, hello, it’s great to meet you. My name’s Natasha Monteith, I use she/her pronouns, and I’m a current third-year PhD student at the University of North Carolina at Greensboro, in the educational research methodology program, with a focus in program evaluation, which is a whole lot of fancy stuff to say that I really like assessment and evaluation, and I want to do it for forever. A little bit about me: I’m originally from Michigan, went to Central Michigan University, where I was an RA and really involved in first-generation college access programs.
I actually proposed a first-generation living-learning community as my capstone honors project before graduating, and then I moved on to Boston College, where I did my first graduate degree in higher education administration. And I served as a graduate resident director there, where I got my first exposure to assessment. I did a junior-year student needs assessment, and I worked on a late-night programming model and explored whether or not that model actually impacted alcohol intake over the course of time. Really cool stuff, fell in love with it. Moved on to Georgia Southern, where I was a resident director for two years. Then I moved to UNCG during the pandemic… It was a time.
And I got to work there for a little shy of two years before moving to a remote L&D role, because I realized I could not do the PhD program and an entry-level housing position at the same time. So, I’m very lucky, I’m now doing my PhD program full-time. I have a graduate assistantship where I’m getting to do evaluation work in the K-12 setting, and also on the academic side of higher education. So, excited and happy to be talking about the student affairs side for a little bit.

Dustin Ramsdell:
Yeah, and I think there certainly will be some kind of cross-pollination or applicability around higher ed writ large. But yeah, we’ll focus the majority of our conversation around assessment in student affairs in particular. So, we might use interchangeable terms here, but I know that’s your background, your interest. And it’s kind of funny, I think most of the folks on the blogging team here, like, we have a current master’s student, and folks in PhD programs and stuff, so we can talk more about that, I guess, of what attracted you to writing at this time.
But it seems like, at least when you’re consuming all of this content in classes and stuff, writing is a good way to get it out. But, to start from a common base of understanding, assessment I think is one of those terms… Certainly right now, I feel like the big one is digital transformation; I’ve just realized that there isn’t a common definition of that. Assessment, I think we’ve gotten a little bit more clear on, collectively as a field. So, I don’t know if it’s your definition, or a more textbook one, but how would you define assessment in student affairs, in particular?

Natasha Monteith:
I’m going to jump on a soapbox real quick. We’re literally less than three minutes in, and I’m already going to jump on one. I think something to start off with is that in student affairs we call this work assessment; in almost every other field it is called evaluation. So, if I’m talking and I use the word evaluation, it means student affairs assessment. But usually when people hear assessment outside of student affairs, they are thinking the GRE, the SAT, language assessment… A lot of standardized measures is what assessment means outside of student affairs. But inside, I think of it as action- and use-based research practices, without the intention of publishing.
So, there’s this guy named Michael Quinn Patton, who basically founded this idea of use-based evaluation. The idea is that if you’re going to take the time to explore a program, understand what it’s doing, and understand how to make it better, you need to be thinking about how you’re going to use that as you go. And I think right now student affairs assessment is really situated in that type of evaluation approach. But that is something I’ll probably talk a little bit about later: if you’re looking for resources, don’t look up assessment, look up evaluation, and you’re going to find so much more that’s relevant and helpful for what you are trying to do.

Dustin Ramsdell:
Yeah, that’s a good bit of advice there, to expand. Because I think that’s the idea: student affairs assessment… I’ve worked professionally in higher education, a lot of it in student support, for nearly 10 years, and I feel like student affairs assessment is its own little pocket community existing within the broader evaluation, probably education evaluation, community. So, if you feel like you’re not being well-served by the [inaudible 00:05:44] folks who use that terminology, it’s because it is… I don’t know. I’m sure the people who prefer that term have a good reason for it. But that’s certainly good, where you can expand your horizons to get a lot more insight. Yeah, that idea of, in higher education you’re offering resources, doing programming, a variety of different things, and you want to know: how much is it being used, by whom, what do they think of it, and how can we adapt and evolve what we’re doing?
And I would even think, just with some conversations I’m having, figure out what maybe we need to start or stop doing because it’s just not serving our constituents well, as institutions evolve and have residential students, hybrid students, commuter students, online students, and adult learners. So, yeah, over the past 10 years I feel like this work has evolved and gotten more of a light shone on it, and I feel like the importance has only heightened. But from your point of view, what makes this such an important practice in student affairs right now?

Natasha Monteith:
Yeah, I think first is funding. Right now, we are seeing enrollments drop across campuses, we are seeing state budgets get cut, we’re seeing federal budgets get cut. Money is harder and harder to come by, and when you have good data to back up why you deserve the money you’re asking for, then you’ve got a better argument. And so, to be able to go up and advocate at a state level, at a federal level, you need to be able to tell that story in a language that transcends the campus. Because what’s happening on college campuses is amazing, and it doesn’t necessarily mean that it’s something that… So, I’m based in North Carolina; the folks in Raleigh might not know what’s happening on a college campus, and might not be able to understand the story if you’re not talking to them in data.
Outside of that, I also think assessment and evaluation practices let you tell your story, but also your students’ stories. So I think about it from a recruitment standpoint: if you have really solid qualitative stories you can pull out and talk about, that’s helpful for recruitment. So, if there are fewer people going into college, and you can go ahead and sell why your campus has a better experience, or a different experience, or might be serving the experience that student wants, awesome. Thinking about data-backed decision making, this is a big area of interest for me. My research interest is in data literacy: how well do folks actually understand data, and can they use it to make decisions? I think that we need to really look at what data literacy looks like in student affairs. P.S., if you’re watching this, please don’t start researching that yet, you’re going to take my dissertation. But I think that is a big conversation.
Because we keep talking about how important it is to make data-backed decisions, but we really haven’t asked ourselves, are we in a place to be making data-backed decisions? And you already mentioned this a little bit too, with program improvement: making sure that what we’re offering is actually going to do what we say it’s going to do. Are students learning the things we say they’re going to be learning in these experiences? What is the impact after an experience? So, you spend $15,000 on an amazing retreat; what is the impact of that a year later? Obviously, there are all sorts of things that are going to end up making that really hard to look at on a one-to-one scale, but thinking about the impact of these really unique experiences that are being offered. And I think something else that’s a big movement happening in student affairs assessment right now is thinking about equity-based assessment.
And so, thinking about who things are working for, who they aren’t working for, and why. There are some really great evaluators who have been looking at this since the ’70s, and have created what’s called a culturally responsive and equitable evaluation approach, and they offer some really helpful frameworks to think about how we are building programs that are actually going to assist the folks we’re trying to build them to assist. So, I think, overall, the reason you do assessment is to try to make the thing the best it can be, and to be able to make an argument for why it should stay.

Dustin Ramsdell:
Yeah, those are really powerful points, and I think there are just so many nuances to each of them. It is for your own sake, again, for what you’re doing, to keep doing it and doing it better and to serve your own office and all of that. But like you said, on the funding piece, that’s obviously for self-sufficiency, to keep getting funding for what you’re doing, but then holistically for the institution, so they can keep getting funding for the organization at large. I was even thinking of alumni: if they donated to something, they want to know what’s happening with it. Or if they know that they’re engaging with an organization that has a culture of transparency and effectiveness, they’re going to know, yeah, if I’m chipping in for this new center for academic success, this is a place that cares about results, measures them, shares them. And I think that’s continued to be a new area to try to tap to get support for new endeavors.
So, I think, yeah, evaluating the work that you’re doing serves the work that you’re doing, but it can also serve so many other stakeholders, constituents, and purposes, where, like you said, it’s a prospective student, your institutional leaders, accreditation, and all these other things. So, it’s baking in that culture, and trying to do it effectively and efficiently. Because I think that’s part of where my mind goes too, and obviously folks like you are writing on this and thinking about it a lot: it’s just the act of doing the evaluating. How do we do that effectively? Because I think most people would understand; it’s like, oh yeah, this is important, I get it.
But they may not be going about it in a way that feels efficient or effective, so I think that’s why it’s an area where I feel like there’s always a good opportunity for discourse, and show and tell, and all that. Because I imagine, and I’ll sort of present this to you, I guess, that a lot of people groan when they hear this topic because it feels like a tedious or agonizing process to go through, but I feel like it’s because maybe they’re just not going about it in the most effective way.

Natasha Monteith:
That’s the biggest point of feedback I’ve heard every time I do student affairs assessment work: this is one more thing. Student affairs folks are busy. And so, I think one of the biggest places we still have to grow is, like you said, how do we bake this in? How do we make this something so ingrained in what we do and how we do it that it’s not this other survey we have to make, it’s not this other focus group we have to do, because we’ve already built it into the plan from the get-go. So again, I’ll throw out to Michael Quinn Patton: the whole premise of use-based evaluation is that you are thinking about how you’re going to use it from the get-go, and that allows you to think, at every step along the way, how are you going to look at what you’re already doing and just add in assessment moments or evaluation moments throughout it?
So, let’s say you’re running a residential curriculum, right? I’m going to shout out Erin McFerrin, who works at Georgia Southern University. She used to have us do an end-of-the-year reflection time called My Hall Story. And it was a beautiful moment of assessment. Because it was a time that was already baked into our curriculum, we were all going to do it, and what we all did was have students come out and basically give feedback to housing and residence life: what was the most impactful moment you had in the last year? What is something we could be doing better? And it wasn’t something that ended up being this big, oh my gosh, I need to make this survey, I have to do all this stuff. It was already prepared for us. There was one year we did postcards, I remember; they were already printed out for us, we just had to take them and do them.
And so, I think about things like that. If you’re working in housing and residence life, how can you build these things in, where the students might not even realize they’re being a part of this, because it just feels so natural? And there’s still a place for Skyfactor surveys, for these big large-scale instruments where we’re going to take this and be able to compare from campus to campus; there’s a place for that. And there’s so much room for us to go ahead and collect really meaningful data in our day-to-day that lets us make everything better as we’re going.

Dustin Ramsdell:
I’m curious how you view digital tools playing into this, because I think that’s probably a place where there’s been an ascendancy of appetite, familiarity, and prevalence of various digital tools that can help serve that idea of baking in, day-to-day, the ability to track how students are using a resource, or to solicit feedback, or to do those comparisons. Do you have any point of view on how digital tools play into this whole equation?

Natasha Monteith:
Yeah, I think Roompact… Obviously, I’m a little biased right now; I write for them, we’re on their podcast. But I think they have a lot of really cool and special tools that you can use. So, if you already have the platform, use it. Another thing I’ve been really, really surprised and happy to see: you have Qualtrics, which lets you do some really cool survey stuff on the quantitative side, and there have been some really amazing qualitative tools coming out. So I love one called ATLAS.ti, which is a space where you can pull up all of your qualitative data, and you can code it in there, you can theme it in there, you can theme it across all of these different platforms, and you can make some really, really impactful patterns pop up that you might not see otherwise.
So, let’s go ahead and say you have a form in Roompact that’s open to students at all points in time, and they’re always welcome to drop in whatever feedback they have. Y’all go, this is really awesome that we have this; we’re going to review it once a month. Because you work in housing and residence life, that’s how much time you have: you can do this once a month and give it actual time. Every month, you take that feedback, upload it into ATLAS.ti, and code it. That way it’s there, and you don’t have to come back to it. But maybe at the end of every semester you are able to look at that whole codebook and see what’s come up multiple times across multiple people. It helps you realize, this is what’s important to our students, and it’s maybe not just a one-off comment; we actually saw this come up 10 times.
Maybe this is something we need to put our time and energy towards to try to fix or address. So, I think a lot of campuses have gotten really good at baking it in; it’s just that they haven’t gotten to the point where they know how to do something with the data they have. Folks drown in data. There is so much data in student affairs, especially in university housing, but it’s what do you do with it? And so, I think it’s a both/and situation. Realize where you’ve already probably baked it in without realizing it, highlight it, and now figure out: how are you going to analyze it, how are you going to make meaning of it, and what are you going to do with it?
So, I think it’s about finding tools that also help you make meaning. I’m a big fan of Qualtrics, big fan of ATLAS.ti, and I think both of them also have really, really good security. I get really nervous about this: please don’t put your stuff up on Google Docs, don’t house your data in Google Suite. Don’t do it, please. Please put it on a secure server. Both of those have really secure servers that make sure you’re following ethics.

Dustin Ramsdell:
Yeah, and that’s a good point, because part of [inaudible 00:18:49] your work and just good advice, which is what we’re going to segue into next. It’s human nature: in a vacuum, we will some way, somehow figure out a solution to our problem. So, students, if they’re registering for classes or something, often just go to their roommate or their floormate or their friend, and ask how to register, what to register for, and all those sorts of things.
So, they’re just doing that in a vacuum if they can’t find the resources they need. And that’s the idea: if somebody’s like, oh, we need to get feedback or do this, that, the other thing, it’s, I don’t know how to best do this, I’ll just make a Google Form or whatever. But hopefully, when someone knows better, they will do better, and I think it’s just knowing that that should not be a long-term solution to gathering student input and feedback. So, are there other bits of advice, whether tools, tips, tricks, strategy, whatever comes to mind, whatever you’re willing to share? What advice do you have for folks trying to improve their assessment efforts?

Natasha Monteith:
I’m a big fan of backward design when it comes to assessment: thinking about what you want to know by the end, setting really solid questions from the get-go, and having your plan of, regardless of what the answers are, how are you going to share that at the end? Because I think that’s sometimes what I see happen: departments will go through large-scale data collection, get all this information from students, from faculty, from other staff, and then things come forward that maybe they don’t like, and therefore that report doesn’t go to everybody. Which, I can understand, evaluation and assessment are very political activities. However, I think we need to have a bit of a moment of realizing that it’s going to be okay if we also share that we’re not always going to be the best. If we’re striving for continuous improvement, there have to be things we need to improve on. And it’s okay to say we need to improve on them, especially if we can provide a plan of, this is how we’re going to do it.
So, I think planning how you’re going to share your information from the beginning is really helpful, and it also allows you to have a level of accountability to everybody, so that you can’t backtrack on that, or if you do, it becomes a little bit more sticky. But have a plan from the get-go of, this is how we’re going to share information. And then, I think one of the big pieces is realizing that if you are looking for tools, as I said before, don’t search for assessment tools, because you’re just going to find a ton of stuff about language assessment; it’s everywhere. If you search for program evaluation instead, you’re going to find tools and things that are going to be a lot more helpful for you than if you’re just looking up assessment.
Especially if you’re trying to self-teach. One of my favorite books that I tell folks about, if they’re just starting out and they’re not really sure what in the world folks are even saying… So you’ve been listening to this podcast and you’re like, she keeps saying qual, she keeps saying quant, people keep telling me formative, summative… It’s okay, we all have to start somewhere. If you’re at that phase, Zina O’Leary has a book called The Essential Guide to Doing Your Research Project, and it’s literally a step-by-step, let-me-talk-you-through-this-as-if-we’re-friends guide on how to do a research project. It’s not going to be as intense; you can pick and choose which parts you participate in, because you’re not doing research, you’re more than likely not going to be publishing what you’re finding, but the tools in it are really, really helpful.
Another one: there’s a book by Funnell and Rogers, I think it’s called Purposeful Program Theory. So, if you’re somebody who really loves student development theory, if you are one of my development theory friends and you want to understand the theory behind evaluation and assessment, it’s a super helpful book. As somebody who has always liked assessment and evaluation but has struggled so much with understanding, why do we do it this way? Help me understand why we decide to ask these questions the way we’re asking them. Why is it important that I put demographics at the front compared to at the back? Funnell and Rogers, phenomenal. They also go through a thing called a logic model, which is something I would say, if you are listening and you are at a director level, look into logic models, and think about making a logic model for your department. It’s basically a visualization of what you do, what you have, and where in the world you are going. And it helps you stay on track.
So, I could go on about this for forever, but I’m going to stop there for right now. Lots of really good tools; I’ll send you a list of what I would recommend as a quintessential reading list of things that are helpful. But I think if there’s one skill I’d tell folks, hey, it’d be really helpful to hone in here, it’s making your assessment plan. And that assessment plan is going to include that sharing plan as well. It’ll include, how are you going to analyze the data when you get it? Because you should be thinking about that as you’re building up however you’re going to collect data. Also have a question mark of, should you be collecting data at all? Again, you’re probably drowning in it. Do you need to make another survey, or do you already have something that’s really, really close to what you need? You probably don’t need that other survey; you probably have something you could use.

Dustin Ramsdell:
Well, and that makes me think, again: do you necessarily need to add a new thing to your plate, or can you even subtract? Is it like, okay, maybe we’re surveying too much, and we’re not getting enough responses, or detailed responses? And as a last point, and then we’ll get to our final couple of questions, a quick follow-up. Because the idea of being so data rich, but not analyzing it and getting insights from it, I think is absolutely the case. So then, if self-reported survey responses are the majority of what you’re relying on… I think they absolutely need to be part of the mix, and you need that input and feedback, but if you’re overreliant on them, a lot of people will often say what they think you want to hear, or they’ll say one thing and actually do another.
So sometimes you do need to actually look at, how is the resource being utilized? What’s the activity in this digital platform? Make sure that’s in the mix, because it’s a bit more telling of what people are actually doing or not doing than what they claim when they’re responding to a survey. And a survey can also have a low response rate, versus, okay, this is a platform all of our residential students should be using, so you can get your arms around things in a more comprehensive sense. So, I’m curious about your perspective on that: how do you try to have a mix of modalities through which you’re soliciting data?

Natasha Monteith:
Yeah. Literally, I’m giggling to myself a little bit, because this is what we’re covering right now in my survey class: when you’re asking for factual data versus opinion-based data. And if you’re asking for factual data, is there a different way to do that? So, kind of what you were talking about. It’s looking at how many people are actually coming in. Something I really enjoyed doing when I was still working in housing was, I don’t want to take a guess at what policies we might need to educate folks around; I’m going to go pull the incident reports that have come through in my building in the last three months and see what I charged these folks with in… We had used Maxient. But it’s like, what did we charge these folks with? Because that’s what we need to be talking to people about.
Obviously there’s a gap here of some kind, whether it’s that folks don’t know the policy exists, or maybe people are making some decisions and we need to do some alcohol education. You’ve got information right in front of you that you don’t need to send a survey about; you’ve already got it. One of the assistant directors I worked with at UNCG, Chris Gregory, during the COVID pandemic did some really cool work following tap access records to see, in the middle of this pandemic, are students moving around as much? Is this potentially why we aren’t seeing folks in the hallways? Are folks even leaving the buildings and coming back? Are there certain times of the day where we’re seeing higher traffic or lower traffic?
So, I think finding factual data spots is really helpful; I think, unfortunately, so much of our stuff right now is built more for that opinion-based piece. There’s some really cool research out there right now in industrial-organizational psychology saying that folks’ self-reporting is actually really, really close to the true value you’d get if it were reviewed by a supervisor. Do I think that’s going to necessarily translate down to an 18-year-old on a college campus who’s never been away from home before? I don’t know. But there are some really interesting conversations about whether self-reported measures are valid and reliable, and some of the research right now is saying, yes, they actually are. Which I think might also flip a lot of our understanding on its head. I remember when I was reading it, I was like, this feels odd and takes everything I know and has me think about it in new ways. So, I’m interested to see how that continues as they keep exploring that area.

Dustin Ramsdell:
Yeah, that’s helpful, just, again, to reinforce that it has its place; when done well, for the right information, it can be very truthful and relevant and useful. And yeah, there can be different dynamics of, how are you asking a question? When are you asking it? And then, that idea of, I’m sure it would be maybe a little under-reported if you’re asking a bunch of 18-year-olds, hey, do you drink? How much do you drink? They’re probably going to under-report it, be like, I don’t know, maybe a little bit, sometimes, versus what it exactly is. Or I can imagine sometimes asking… And again, this could be relevant if that’s what you’re looking for. But overconfidence, maybe, about comfort with technology or different things, where it’s like, hey, these are tools that you’ve never actually used before, so [inaudible 00:29:40], oh, I can learn new technology tools really easily, but then you’re actually seeing that, wow, help desk tickets from first-year students have been spiking about…
And those would be interesting things, where, again, that could be the insight: well, students come in a little overconfident, so we do need to try to address that. But then there’s that idea that students might under-report their drinking, and the incident cases are higher than they’ve ever been for alcohol consumption, or possession, or whatever, so these people say that they barely ever drink, but the incidents say otherwise. But it could also be that, well, actually not that many people do drink, and it is just a few people who keep getting caught; they will not get the message here. So yeah, you have to try to have that diverse mosaic of data, because if you were completely reliant on incidents, or completely reliant on opinions and self-report surveys, you might not be completely missing the mark, but the mix can help you get that much closer to the bullseye. So, yeah, I was just curious about your perspectives on that.

Natasha Monteith:
You’re good. And what you just said sparked something that I wanted to bring up today. Because I think you’re right, folks are going to give different answers depending on who they’re talking to. And something I have not seen yet, and I hope there’s a department out there doing this, something I think would be really, really helpful, is if a department decides they’re going to do some kind of qualitative data collection, whether that’s a focus group or an interview, I think it’d be really impactful to hire undergraduate research assistants. The impact of having somebody who’s maybe in their junior year, who has gone through your background checks, has all that done, doing interviews with your residents: you’re going to get a different level of, in my opinion, transparency than if it’s somebody like me, who is far removed and potentially viewed as an authority figure. You’re going to get different information.
And that falls back into this idea of culturally responsive and equitable evaluation. There’s this idea of… Oh, cultural guides, that’s what they’re called. So, in this framework, there’s this idea of cultural guides, who are people you hire to help you navigate the culture of the folks you’re trying to understand. Because you can acknowledge, I’m not a part of that group, and therefore if I come in, I’m inherently changing the dynamic, and I’m not going to get the best data possible. So, let me hire somebody from that group, pay them for their labor, and get information that’s going to help us make better decisions. That’s something I’ve chatted about with way too many student affairs folks. I’m like, I don’t see why we’re not doing this. I’m sure there’s a reason, but if there isn’t, I hope somebody can take that and run with it. Because I think you’ll get a lot of really cool, very, very vulnerable information when you let students talk to each other.

Dustin Ramsdell:
Yeah, a well-trained, prepared peer, I think, could get some really amazing insights. But this is a question that we ask all of our blogging team, because there could be interesting ways that folks have found their way to the team this year. So, what attracted you to writing for Roompact?

Natasha Monteith:
Yeah. So, I told Paul in my interview, I am trying to get into the habit of writing before this dissertation hits, and now I am forced to get into the habit of writing. That was one of my big pushes; I was looking for something like this already. However, I got really excited when Roompact posted, because I’m not in a higher education or student affairs grad program. I weasel it into a lot of my classwork, because I’m passionate about it and I know I want to eventually come back, but I hadn’t really had the opportunity to draw those explicit connections. So, being able, every month, to sit down and say, this is the stuff I’ve been learning for the last three years, let me apply it to the field I love. And let me give folks what I hope are really, really helpful step-by-step, this-is-how-you-do-it, this-is-what-it-looks-like pieces… Let me give you tools that you can come back to year after year.
I was really happy to be able to do that training assessment series, because I thought it’d be really helpful for folks to even send it to somebody during onboarding. So, you just got added on to the student staff training committee: hey, we’re going to send you this, this is how we think about assessment here, this is a part of your onboarding now. My hope was to have it be something that outlasts me a bit, and that people can hopefully continue to find value in. But I tried really hard to not get caught up with language stuff. I think there are a lot of resources out there that can help with qualitative versus quantitative, summative versus formative; there are folks who are already doing that. There’s not a lot out there that really shows you how to do it in a housing context.

Dustin Ramsdell:
Yeah, I think that’s always a downside of content nowadays: everybody wants to be the thought leader. So I think, as much as you can, get into, here are tactics… And that sometimes can set a bit of an expiration date on certain things, but a good piece of content is going to have a pretty good shelf life, because even if it’s not modern right up to the millisecond, it’s like, oh, this was from this time, but I can take inspiration from what they were doing. That’s why I feel like certain textbooks just have really good longevity. So, yeah, I’m seeing more people creating podcasts, or writing, or doing different things, where we’re getting into tangible, specific, relevant tactics for doing specific work.
Because at a certain point, we have enough people just espousing thought leadership; that always is going to have its place, but it’s refreshing to see folks committing themselves to getting really [inaudible 00:36:28], so I appreciate you doing your part there on the assessment front. So, final question here: as the eternal optimist, I always like to talk about what folks are looking forward to in their work. Certainly you’re in your program, and you’ve got all that writing ahead of you, but what are you looking forward to right now?

Natasha Monteith:
Yeah. So, right now I’m in the mix of realizing that the classwork in my program is wrapping up real soon, and I’m getting really excited to move back into student affairs. I’ve started to poke around, maybe apply to a job here and there. But I’m really, really hopeful that a year from right now I can say, oh no, I am a student affairs assessment practitioner. That’s what I’m doing. And I’m over the moon about my research too. Just within the last two weeks, I finally found the tool I’ve been looking for for the last two years. Somebody finally wrote a test, and I was like, beautiful, I don’t need to make a test now. So, I am really, really excited to dive into my research. I am patiently counting down until January, when I will only be working on my proposal, and hopefully starting data collection next May. So, I am just so ready. I feel like a track person at the start, up against the blocks, like, okay, just blow the whistle, let me go.

Dustin Ramsdell:
Yeah. Yeah, that’s great. Because, to stay with that metaphor, someone who’s going to run this huge marathon does a lot of exercise and prep and training; that’s part of the deal, the process leading up to this one big moment. And that idea of a doctoral program being this big commitment, but so different from a lot of other academic endeavors people go on, because it is so hyper-focused on what you’re interested in and want to explore. So, yeah, really excited for you to begin this journey soon, and for us all to see what becomes of it. And I just appreciate your time on this episode, and being a part of the blogging team at Roompact, and sharing all your insights on the blog, and here. We’ll have ways to connect with the resources that you mentioned, and to connect with you, to keep the conversation going. But yeah, thanks so much for your time.

Natasha Monteith:
Thank you so much. And genuinely, anybody who has questions, hit me up on LinkedIn. I’m always happy to chat, and best of luck with everybody’s assessment efforts.


About ResEdChat

Roompact’s ResEdChat podcast is a platform to showcase people doing great work and talk about hot topics in residence life and college student housing. If you have a topic idea for an episode, let us know!
