Episode 54 - Training Faculty to Use Student Success Data

Jeff Gold, Assistant Vice Chancellor for Student Success Strategic Initiatives at the California State University System Office, joins us on this week's podcast from Long Beach, California. The trio discusses how the California State University System has innovated and improved the student experience by using data to inspire action among faculty.


Full Transcript

Steve Meredith: Hi again everyone, and welcome to Solutions to Higher Education, a podcast featuring Scott L Wyatt, the president of Southern Utah University in Cedar City, Utah. I’m your host, Steve Meredith, and I’m joined today in-studio, as I usually am, by President Wyatt. Hi, Scott?

Scott Wyatt: Hello, Steve, it’s great to be here today.

Meredith: It is great to be here, a beautiful spring day, and this probably is my favorite time of year in Cedar City. The wind dies down a little bit, we’re…it’s been a wet winter for us, it still remains a little bit wet, but the renewal of life…it’s a beautiful time to be here.

Wyatt: Yeah, it is a terrific time and our guest is in Long Beach, so, we’re going to assume that it’s a beautiful spring day in Long Beach.

Meredith: That’s right. You mentioned our guest, so, in our ongoing innovation podcast, we found a series of articles in The Chronicle of Higher Education about what campuses were doing to improve student life on campus in a variety of ways: retention and recruitment, student jobs. And this one caught our eye in particular. Will you introduce our guest?

Wyatt: Yes, thanks Steve. So, we’re delighted to have with us today from Long Beach, California Jeff Gold, Assistant Vice Chancellor for Student Success Strategic Initiatives. Welcome, Jeff, and thanks for joining us today.

Jeff Gold: Thank you both for having me, it’s my pleasure.

Wyatt: So, what is it like in Long Beach this morning?

Gold: Well, I got a little chuckle because we’ve had what we call a “cold and wet winter” but I don’t think it holds a candle to what you folks have experienced in Southern Utah there. [All laugh] Today, like with you folks, it’s just a beautiful day. The sun is out and it’s just a terrific time of year as well over here.

Wyatt: So, you’re an assistant vice chancellor in the Cal State System. Why don’t you…we’d love if you would just kind of introduce yourself and your career path. How did you find yourself in this role?

Gold: Sure. Why don’t I start first with a little bit about the System? Then I’ll go into myself. Again, thank you for having me. It’s just a pleasure to talk with you folks, like-minded individuals, about how we can improve student success. So, I’m an assistant vice chancellor at the California State University System Office. Our chancellor’s office…as you may know, we have 23 institutions, we’re huge. About a half a million students across our 23 institutions. Some big, some small, some urban, some rural, all the way throughout California from the north, Humboldt State, to San Diego State University in the south. And we take a lot of pride in being one of the most ethnically and racially diverse and one of the most affordable university systems in the nation. And so, about half, a little more than half of our students, are Pell-eligible students, so they’re low-income students, and about a third are the first in their families to go to college. So, we’re thrilled as a system to be able to serve these students. And I will say, as context for what I’m going to talk about today, traditionally we’ve taken a lot of pride in who we let in the front door, so, the access part of our mission has been all about making sure that we’re very inclusive and that our population mirrors, which it does, the great state of California. But until, I’d say, the last 15 or 20 years, we have paid less attention to the completion agenda, to graduation and to success. And so, that has really been a focus of mine over the last several years: “How do we intentionally focus on helping our students, and in this case, our most underserved students, cross the commencement stage expeditiously?” So, that’s a large part of my work, and to answer your question, I came into it from kind of a strange background, not really an academic background. I began my career many years ago as a public school teacher in San Diego and from there went on to get an MBA and go into the private sector where I worked largely with education companies, working on business plans, training companies as well. And then over the years, made my way into academia as an adjunct professor at Pepperdine University here in Southern California and finally, 15 years ago, landing at the California State University System office where I have been involved in a variety of strategic and academic projects with various constituents throughout our campuses.

Wyatt: Well, it seems like those different experiences—high school, business sector, adjunct faculty, system office—would give you a broad perspective and maybe actually kind of a way to be more creative because you’ve seen so many different worlds.

Gold: I think that’s right. I’m able to pick and choose a little bit from my experiences and perhaps see patterns across our complex system that others haven’t. I will tell you that in my office, the majority of my colleagues have spent their lives in academia and there’s a lot of benefit to that as well. There’s tremendous expertise, for example, in education or in other disciplines that are directly applicable but, I agree. I think I bring a certain, maybe a broader, lens. Definitely not deeper than my peers but perhaps broader in certain areas when it has to do with innovation, creativity and applying everything that we know from maybe a larger lens to the student success discussion.

Wyatt: So, we love what you said about the fact that your System prides itself in access. A lot of first-generation college goers and very diverse backgrounds. Many universities pride themselves in who they don’t let in. [Laughs]

Gold: Yeah.

Wyatt: Meaning a very…a student body that’s almost predisposed or predetermined to be successful. And when you make it an intentional effort to invite students in who are less prepared, in a lot of ways, that makes your job so much harder, doesn’t it?

Gold: I think that’s right, but I will also say the flipside of that is it makes it that much more rewarding. You know, you folks may have seen that there was, about a year and a half ago, a report that came out of Stanford, some research by a researcher named Raj Chetty, who is an economist, who looked at really this idea of which campuses are helping students achieve the American dream. And what their research did was it looked at which types of students campuses were letting in. And so, in other words, they disaggregated the income levels of their students, so your Pell-eligible students who are already low-income, looked at the bottom 20%, so your poorest kids, and then looked 20 years later at the IRS tax records to see what happened to those kids.

Meredith: Hmm.

Gold: Did the university education, as one of the factors, help to lift those kids, their families, their communities out of poverty? And what it found was, exactly to your point, the Harvards, the Princetons, the more selective universities were tremendous at doing that. But the problem was that they let in so few very poor kids that even though those poor kids end up having a better lot in life, there’s just very few of them. So, their overall impact is low. But when you look at many of the CSU campuses, in fact, one of our campuses, Cal State LA, came out as the number 1 campus for promoting this economic mobility. In fact, what they found is, more than any other institution, when you look at both the access side and then the success side, or the economic mobility side afterwards, Cal State LA was the best university out of more than 1,500 universities throughout the nation at taking in significant numbers of the lowest of low-income students and then, after 20 years, lifting them to amongst the highest or second highest rungs of income in the U.S. And so, at a time when, as I’m sure you’re aware, the news isn’t great around the world, there’s a lot of challenges, certainly a lot of issues around economic mobility, the idea that the American dream indeed is not dead and that there are great institutions that are promoting this economic ascent is really heartening. And to your point, it is very challenging work, but in my mind, it’s some of the most rewarding work that we do.

Wyatt: That’s what we deliver, isn’t it? We deliver the American dream.

Gold: That’s it, yeah.

Wyatt: Well, you have a very fascinating program that you’ve done relative to helping faculty get access to student data and then use that to make learning more effective and improve outcomes. Why don’t you talk to us about how that got started?

Gold: Sure. So, I think I mentioned I came to the chancellor’s office about 15 years ago. Well, not too long after my arrival we began a System initiative called our Graduation Initiative, which used peer benchmarking data to take every one of our 23 campuses and set aspirational goals for both completion rates, so for graduation rates, and also for our equity gaps, so, you mentioned first-generation students, to close the gap between graduation rates of our first-generation students and their non-first-generation peers. And as part of that process, we set those goals and then we had the campus presidents with their staff create plans about how they were going to hit their goals. What were their implementation plans? What strategies were they going to employ? And a couple of years later I was with a delegation of folks from our office that went to visit each campus and ask them for data and information about how they were implementing towards their goals. And I will tell you, it may come as no surprise, but the stories we heard were all uplifting. And, for context, we’re talking about the year, say, 2011-2012 when our budget in the state of California and, indeed, nationally was suffering tremendously. So, within that context, we’d go to campuses, we’d ask them how they were doing towards their goals, we came with a little bit of data, and they showed us all sorts of wonderful things. They showed us their first-year programs, they talked to us about peer mentoring, they talked to us about supplemental instruction, all of the great things, undergraduate research, which were terrific. But when we looked deeper and we asked questions like, “For your students who participate in undergraduate research, how are your Latino students doing with that? Is it better than their peers? Which of your programs has the most disproportionately positive effect for your poor students? What is the participation rate of African American males in your peer mentor program?” We asked some very basic questions about the effectiveness and the likelihood of hitting their overall strategic goals based on their current implementation plans, and what we found is that very, very few people, the program directors, the administration, could answer those questions. And so, to me, it became really clear very quickly that in my office centrally, we collect an inordinate amount of data. We always have from the campuses. But our goal had always been, until the last few years, to do that for compliance reasons. So, we report the data to the federal government for IPEDS reporting, we report it to the state government for compliance reporting and to our chancellor and the such, but we had never, until five, six years ago, reflected that data back to the campuses, and especially not in meaningful ways. So, the whole philosophy of this program that you read about in The Chronicle was to leverage some system-wide tools that we’ve developed over the last five years, specifically a student success dashboard, to engage increasing numbers of campus constituents in, first, understanding the data and then, secondly, appreciating how they can apply it to their practice. So, one thing I’ll mention just off the top of my head that became apparent quickly is that we needed to disaggregate the data down to the department and down to the major level.
And so, even though we had all of these goals and we had fairly strong buy-in from senior administrators on campus, to be frank with you, it used to be that we would visit a campus and you would run into a faculty member and they were, at best, only vaguely aware of the goals that we had set as a System and for their campus. And to be frank, it was hard for them to feel bought in to that. So, for example, if you were a chemistry professor at San Jose State University, you cared a lot about the chemistry students that came into your classroom, but the idea of an entire campus goal for the year 2025 on graduation rates or closing equity gaps by a certain date, you thought that was great, but it was hard for you to see the relationship between those large strategic goals and what was happening every day in your classroom with your students. And so, in a nutshell, that was the philosophy of both the system-wide dashboard and of this program that we developed: “How do we get these data, data that we know that our faculty care about, into their hands in ways that they can use it?”

Wyatt: So, you’ve got all of this information that’s sitting there unused except for compliance with federal government programs?

Gold: That’s right.

Wyatt: And how do we take this and make use of it? That’s a…

Gold: That’s it.

Wyatt: Yeah. It also makes us happier to collect data. [Both laugh]

Gold: It gives us a sense of purpose, that’s right.

Wyatt: That’s right. So, give us an example of a project that came out of this effort?

Gold: Yeah, I’d be happy to. So, maybe a couple of examples to clarify it…before I do that, let me just give…it’s really hard over a podcast to present this, but one of the things that we pride ourselves on in this office is our ability to bring the data to life. And so, if this were a screencast instead of a podcast, I would be able to share with you images of our dashboards, which often use moving figures of people to illustrate concepts. We use animations, we try to bring the data to life in a way that people can understand quickly the meaning behind the data they’re seeing and how it applies to their practice. It takes a lot of time, but we find that it’s really important for people to connect at a human level with the data so that they’re not just staring at a bunch of numbers and percentages but that we’re really thinking about the individual numbers and the impact that they have on the individual lives that come through our classrooms. So, in terms of how that manifests itself in the program and what this looks like for a faculty member, we started by asking our faculty questions about what they know about their students, their success rates, what happens to them before and after they come to them, do they know if they change majors? How much do they know about their students and the curriculum above and beyond the walls of their classroom? And we found that, overall, that information is really limited and yet it was an area that everyone seemed to be very interested in. So, as an example, one of the graphs that we have established as a result of it is a time-series chart. So, we go back in time six years because we want to look at the academic trajectories of students all the way down to every major throughout our campuses. So, if you could imagine, you come and let’s say—we’ll go back to the San Jose State campus—let’s say you’re that chemistry professor at San Jose State, and let’s say that your specialty is in the introductory courses and that you’ve been teaching, let’s say, General Chemistry for years and years and years at San Jose State, you probably know a little bit about your students who come to you. You maybe know which high schools they come from, you maybe know a little bit about their academic preparation, but you often don’t know, “Are these chemistry students that came to San Jose State with a declared major in chemistry wanting to be chemists?” Maybe they’re nursing majors, maybe they’re engineers taking chemistry for other reasons. That kind of information is really important for how you develop your curriculum, but you don’t have that information typically. You also don’t know what happens to those students afterwards. Do they change their major after your course? Or do they go on to complete their degree? These are questions you probably would like answered, that could help to inform your pedagogy and support the way you think about your classroom, but that data is often invisible to you. And so, what we’ve done with the time-series charts I mention is, if you can imagine, we have created icons of actual students, so stick figures of students, with a time series moving from left to right where the X-axis is the year, so it moves from 2012, ’13, ’14, ’15, ’16, ’17, ’18, the six-year period, and then on the Y-axis, as the students move up, you’re seeing their credit accumulation pattern. So, in the first year, ideally you would move from zero to 30 units if you’re on a four-year plan.
The second year from 31 units to 60 units, all the way to 120 units and completion. We also color-code these students and if they turn red at a certain point, that means they stopped right there and they dropped out. If they turn green, it means they’ve graduated. So, this is really one of the first times I’ve tried to describe this graph visually over a podcast, but if you can imagine, for every single major that we have throughout this huge university system, we can look at every student who comes in and look at patterns visually to see, “Am I seeing a lot of red in the first year under 20 units? If so, what does that say?” As that department chair, for example, in chemistry at San Jose, what questions would I ask about what’s happening in that first year, in the classroom and maybe outside of the classroom, for students to be experiencing so much attrition in that first year? If I’m seeing, for example, that toward the later years students are leaving, hopefully I would do some research and find that these students are getting amazing-paying jobs and maybe they’ll come back, but oftentimes, we don’t know what’s happening. In some cases, they’re financially needy students who are running out of grant funding and simply can’t afford to take those last two or three courses. And, in an example like that, it is just such a tragedy for the students themselves and, indeed, for the state of California, to have invested years and years and years of preparation for this student to come up just a few courses, a few units short. So, this is just one example of how faculty can come to a tool like this, they’re not put off or scared by the technology, they can click a button and quickly see, for students they know, for students they relate to, what is the pattern of either continuing through towards degree or dropping out after a certain period of time, and what role do they and their department have in creating a more inclusive or supportive or welcoming pedagogy, by implementing different types of standards, for example, flipping their classroom, providing supplemental instruction outside of the classroom. The idea is to change the philosophy here from, “My job as a faculty member is to just teach this lesson and hope that people get it and if they don’t, too bad for them” and instead say, “My job is to facilitate learning, and certainly not every student will pass, but I can go home and sleep well at night if I know that I’ve done everything I can to provide every opportunity for my students to succeed.”
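For readers who want to experiment, here is a minimal sketch, in Python, of how a credit-accumulation chart along the lines Gold describes might be drawn. The trajectories are simulated, and the colors, thresholds, and field names are illustrative assumptions, not the CSU dashboard’s actual implementation.

# Sketch of a credit-accumulation time-series chart like the one described above.
# All data here is simulated; the transcript does not describe the CSU's real
# dashboard code, so treat this purely as an illustration.
import random
import matplotlib.pyplot as plt

YEARS = list(range(2012, 2019))                     # the 2012-2018 window Gold mentions
OUTCOME_COLORS = {"graduated": "green", "dropped": "red", "enrolled": "gray"}

def simulate_student():
    """Return (cumulative units by year, outcome) for one hypothetical student."""
    units, cumulative = 0, []
    for _ in YEARS:
        units += random.randint(12, 32)             # units earned that year
        cumulative.append(min(units, 120))
        if units >= 120:
            return cumulative, "graduated"          # turns green: degree completed
        if random.random() < 0.08:
            return cumulative, "dropped"            # turns red: stopped out here
    return cumulative, "enrolled"

students = [simulate_student() for _ in range(60)]

fig, ax = plt.subplots(figsize=(8, 5))
for cumulative, outcome in students:
    ax.plot(YEARS[:len(cumulative)], cumulative,
            color=OUTCOME_COLORS[outcome], alpha=0.4, linewidth=1)
ax.set_xlabel("Academic year")
ax.set_ylabel("Cumulative units earned (120 = completion)")
ax.set_title("Hypothetical credit-accumulation trajectories, one major")
plt.show()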

Wyatt: Well, Jeff, you are very descriptive. I could see that chart, that dashboard, in my head easily. [Laughs]

Gold: How about that? [Laughs]

Wyatt: So, if that’s the first time you’ve described it, you would be a very good teacher.

Meredith: Yeah.

Gold: [Laughs] I appreciate that.

Meredith: Jeff, I’m our accreditation liaison officer and also in charge of our institutional research group and I have a couple of questions for you. The first one and by far the more important one is, “So how do you roll this out to faculty?” Are there seminars? Is there training? Exactly what have you undertaken with them to help them infuse this student data into their thinking?

Gold: Yeah, so that’s a good question and there’s two ways. One is that it is password protected to CSU faculty, staff, and administrators. So, we have 50,000 faculty, staff, and administrators. They all can come and get it whenever they want, but I’m sure you know, and I’m sure you appreciate, it’s one thing to have something that’s valuable, it’s another thing to get people to use it. I will say though that…and so there’s a strategy that I’ll talk about in a second, but I’ll say that my team is not quick at creating new visualizations. It takes us time and one of the reasons it takes us time is because I challenge my team every day to think about people that are outside of higher education, with the idea that every visualization that we create needs to be intuitive enough that it doesn’t require training. That any of our spouses…or friends outside of higher ed could take a look at the graph and get a pretty good understanding of what we’re trying to convey. So, we’re not always successful with that. As you know, Steve, the data can get very complex and it can get very specific to higher ed at times, but we do a pretty good job of making our interfaces intuitive enough so that someone can come in, someone can hear about the dashboard from a colleague, a peer, or come to one of our seminars and then they can come in and freely explore without having to have significant training. So, to answer your question, we have two elements. One is that we try to get the word out through our campuses’ administrators, through our faculty senates and through any way we can that this is a valuable tool and that it’s set up for the campuses. Then there’s two other ways that we have gotten people to use it. One is we have created a customized email message that about half of our campus provosts have sent to all of their faculty. So, for example, we created an infographic. Let’s say you’re at our Cal State LA campus and let’s say you’re a psychology faculty member. We asked the provosts, or the academic side of the house, to give us a list of all of their faculty members’ first name, last name, email address and then their department. And then, on behalf of the provost, who sends a message to explain the context for this dashboard, we send out via email from the provost a customized message that has links to different parts of the dashboard. So, if you’re that psychology faculty member at Cal State LA, you might get a link that welcomes you to the dashboard and encourages you to use it from the provost, and at the bottom is an infographic with buttons that say, “Click here to see how many underserved students in psychology at Cal State LA are leaving in the first year” or, “What is the GPA gap? The difference in grades between first-generation students in psychology courses at Cal State LA and their peers?” So, we’re really personalizing and teasing the very important data that we know these folks are going to be most interested in, and then we can track how many people are coming in. And sometimes we see that email message go out and 50%-60% of the faculty are into the dashboard within a day or two because it piques their interest. So, those are two…one is just if they happen to find it, we create intuitive dashboards, two is we strategically try to target email campaigns from people they know on campus. And then the third way, obviously, is this program that you read about, this Certificate Program in Student Success Analytics, and that really allows us to go in at a lot deeper level.
We can really roll up our sleeves, get together as a learning community, not only on our own campus, but across the system, to look at different data points and ask questions together about how to improve practices.
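As a rough illustration of the personalized provost email campaign Gold describes, the sketch below builds a customized message for each faculty member from a roster file. The roster columns, dashboard URL, and link wording are hypothetical placeholders; the episode does not describe the CSU’s actual endpoints or templates.

# Sketch of a personalized email campaign driven by a faculty roster.
# The URL, CSV columns, and message text are invented placeholders.
import csv

DASHBOARD_BASE = "https://example.edu/student-success-dashboard"   # placeholder, not a real CSU URL

TEMPLATE = """Dear {first} {last},

Our campus now has access to the system-wide student success dashboard. Two views
specific to your department may interest you:

  * First-year attrition among underserved students in {dept}:
    {base}/attrition?dept={dept_slug}
  * GPA gap between first-generation students and their peers in {dept} courses:
    {base}/gpa-gap?dept={dept_slug}

-- Office of the Provost
"""

def build_messages(roster_path):
    """Yield (email address, message body) pairs from a CSV with columns
    first_name, last_name, email, department."""
    with open(roster_path, newline="") as f:
        for row in csv.DictReader(f):
            yield row["email"], TEMPLATE.format(
                first=row["first_name"],
                last=row["last_name"],
                dept=row["department"],
                dept_slug=row["department"].lower().replace(" ", "-"),
                base=DASHBOARD_BASE,
            )

if __name__ == "__main__":
    for address, body in build_messages("faculty_roster.csv"):
        print(address)          # in practice these would feed a mail-merge or SMTP step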

Meredith: So, my second question about that, about your process, is, and I mean this in no way to throw my colleagues under the bus at all, it’s just that in the IR area, I supervise people who are data wonks. That’s their job and they are exceptionally good at it. You mentioned challenging your staff to create stick figures and other visual items that are going to help people outside of our world, or our circle, or the way we look at things, easily understand the graphics that they might see or easily be able to interpret the data. Do you have just a staff of statisticians? Or do you employ graphic artists as well? Or how have you moved forward with that? Because we have a terrific data dashboard system that we’re very proud of and have been nationally recognized for. But I…one of my challenges with it is that it’s just not very exciting to look at and if you’re outside of our realm, as you suggested, it’s a little bit hard to interpret.

Gold: Yeah, so it’s a good question, and we do. So, my team is very small, but we do have…I do have a graphic designer and she is instrumental to the process. Creating new dashboards is always iterative. I will sometimes be in a meeting or on a plane and get out a cocktail napkin or a piece of paper and sketch out something. But we always go through a process, an iterative process, and everything always goes through our graphic designer. There’s also a component of instructional design there as well. We are trying to elicit action. We’re not trying to just create pretty figures or engrossing animations, and so, we challenge the designer to think about, “What are the one, two or three takeaways that every person that comes to this graph should have?” And if that’s not immediately clear, we go back to the drawing board. So, I think you’re not alone. I think when you say the word “dashboard,” or if you look at the different dashboards from the different university systems you’ve seen, they’re typically built by data people for data people. And there’s nothing wrong with that. In many cases, those are the people…that’s the intended audience.

Meredith: Right.

Gold: But, for us in the California State University, we were very intentional about drawing a larger target audience and knowing, frankly, that with how busy people are and with how intimidating data can be, if we didn’t go out of our way to really think through the design process and make the data welcoming, they might come and take a look, but they certainly wouldn’t internalize the data, much less act on it.

Meredith: So, what have been the outcomes? Can you draw straight lines to this infusion of student data into the lives of faculty members and see outcomes that are measurable for you and that are in the best interest of students?

Gold: Yeah. So, what I’ll…I have to tell you now, it’s mostly anecdotal. We will be conducting some research on this, but for now, what I can tell you is that there’s two answers to this question. One is the level of excitement, the level of efficacy, the level of just enthusiasm around the data, certainly for our program, from day one, when we meet the first time as a learning community, to the end, it’s a pretty incredible transformation. So, the types of questions you’re having typically on day one are people questioning the accuracy of the data, making sure they’re comfortable with you and what you’re trying to do and the such. By the end of the session, people have really started to peel back the layers of the onion and really get into the sense of…so, I mentioned in the example of that dashboard, what’s happening in the first year. So, if a department has identified through the data that there’s a problem in the first year, there’s all sorts of questions, intelligent questions, that then follow about their role in changing the curriculum and measuring those changes and building buy-in for that. And so, the first part of the answer is that there is this culture of evidence that, through the program, we can see evolving through the discourse. And it’s really something to behold. By the end of the program, the conversations that you have are so much more adept and so much more student-focused. Obviously hard to measure. I will say, there’s another component where I’ve seen the most impact: we have another dashboard that has grade distribution data for every course that’s taught at the CSU up until this last fall. And what you can do, again, say you’re that…let’s say you’re that chemistry professor that I keep mentioning and referencing at San Jose State, you can look to see overall the courses that are taught, either in your college, the College of Sciences, or in your department of chemistry over the last few years. You can see how students are doing, number one, but we’ve disaggregated the data so that you can see, in the courses that you teach, so that General Chemistry course, you can see, “What were the outcomes over the last few years for my students of color versus their peers? What were the outcomes for my first-generation students versus their peers?” and so on. And so, given the fact that the CSU, as I mentioned before, has such an equity mission, it’s really in the fiber of who we are, we get a lot of faculty who choose to work for the CSU because of that mission. And when they take a look at their data, many of them believe in their hearts, and they are, they’re champions of equity in their everyday lives, but we had…you know, I can give you an example. There’s a political science professor at one of our campuses, Northridge, who has said on many occasions that she came into the dashboard and found out for the first time that she’s been talking the talk, but in her own classrooms, there were huge gaps in performance between students that she had been unaware of, and she was shocked, to the point of it being an emotional thing for her, to see that she was part of continuing this inequity in her course. And so, she has changed her pedagogy, she’s really looking at additional support structures. That’s one example. Another example is one of my colleagues that’s working on this project with me, I have her on load as a faculty member from one of our campuses, Cal State Long Beach down here.
She went back to campus for an event a couple of months ago and ran into her department chair, mentioned what she was doing, and her department chair said to her, “Oh, I use the dashboard all the time. It’s terrific. In fact, I noticed that for one of the faculty members in my department, there were real disparities in the grading between students of color and their peers. And normally that would be kind of a taboo subject or something that would be really difficult for me to bring up, but I just walked over to his office, asked him if he had seen the dashboard, we looked at the data together, and it just made a really easy transition into him saying, ‘Wow, that’s my course. I never knew…’” And so, again, we haven’t conducted thorough research on this, we will conduct more research, but at least anecdotally we are hearing and starting to collect stories of folks from around our campuses who are internalizing these data and beginning to see their role in promoting the equity agenda in their classrooms.
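A small sketch of the disaggregation behind the grade-distribution view Gold mentions: mean GPA and DFW rate (D, F, or withdrawal) broken out by student group for a course. The column names and the handful of sample rows are assumptions for illustration, not CSU data.

# Sketch: disaggregating course grade outcomes by student group with pandas.
import pandas as pd

GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0, "W": None}

records = pd.DataFrame([
    # course,     group,              grade   (made-up rows)
    ("CHEM 101", "first-generation", "C"),
    ("CHEM 101", "first-generation", "F"),
    ("CHEM 101", "first-generation", "B"),
    ("CHEM 101", "continuing-gen",   "B"),
    ("CHEM 101", "continuing-gen",   "A"),
    ("CHEM 101", "continuing-gen",   "C"),
], columns=["course", "group", "grade"])

records["points"] = records["grade"].map(GRADE_POINTS)      # withdrawals carry no grade points
records["dfw"] = records["grade"].isin(["D", "F", "W"])

summary = records.groupby(["course", "group"]).agg(
    mean_gpa=("points", "mean"),
    dfw_rate=("dfw", "mean"),
    students=("grade", "size"),
)
print(summary)

# The "gap" a faculty member would see is simply the difference between groups:
gap = summary.loc[("CHEM 101", "continuing-gen"), "mean_gpa"] \
    - summary.loc[("CHEM 101", "first-generation"), "mean_gpa"]
print(f"GPA gap in CHEM 101: {gap:.2f}")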

Wyatt: We don’t know what we don’t know. [Laughs]

Gold: That’s it, yeah.

Wyatt: That’s a fascinating example of…those examples you just gave, and I love the one about the political scientist who believes with all her heart that she’s doing something, but then the data shows that it’s just not happening.

Gold: Yeah.

Wyatt: What an opportunity for her to do what she’s trying to do.

Gold: That’s it. And we’re hoping to try to catch some of these stories on video. Again, if we get the time, the money and maybe some funding, we’d like to do a more extensive research study to quantify this. So, in the meantime, we’d like to capture these stories just so that people can understand the impact that really looking through the data, rolling up your sleeves and wrestling with it can have on your practice.

Meredith: Yeah, they’re compelling stories, just as you’ve described them here.

Wyatt: Yeah. And so, what does…what kind of support or…take us to the next step? Do you have an example or something about, like this faculty member in political science, what was she able to do with this information?

Gold: Yeah. So, her specific example I haven’t followed up on, but many examples are similar to things I’ve talked about in terms of just starting to look at your own pedagogy. So, it starts with an awareness. It could be as simple as, “I’ve never noticed this, but now that I’m tracking it, I call on certain students all the time” or “I don’t have collaborative activities that allow for increased inclusiveness in my discussions” or “If I were to flip the classroom and make it less about me and more about classroom discussions, that might help.” And so, there’s a whole body of research about creating more inclusive pedagogy and strategy. There’s also a body of research about what can happen outside of the classroom with wraparound support services, providing supplemental instruction, tutoring and the such, and so, in the case of that specific faculty member, we need to follow up. I’m not sure. But those are some of the tools that are out there. What we try to do, also, is partner in some cases with faculty development centers on our campuses. And so, obviously the data highlights in many cases the potential for changes. Given the scale of our system, it’s beyond the reach of my office or our capacity to be able to work individually with faculty to translate the meaning of the data into action in their classroom. But in most cases, they can go to their faculty development centers. You know, bring that data with them or explore it further and then come up with a plan to really make a change and measure that change in their classroom.

Wyatt: I’ve taught…I’m the president of the university and so I’m not really a faculty member, but I teach as an adjunct and I have been teaching kind of in an adjunct role for the last 11 or so years, and about nine or ten years ago, a good friend of mine who has been a civil rights advocate since the days of Martin Luther King, when he was really young, asked me one day, about a class I teach, American National Government, he said, “So, how many African Americans do you talk about in your course?” And I thought, “Wow, that’s an interesting question. I’ve never really thought that through.” And then to continue it on, “How many Latinos do I talk about? How many heroes in our country’s stories do I refer to that are from all of these different groups, so that the students in my class can identify with somebody, see themselves as a part of the story, instead of just, ‘The heroes are all others, not from my community.’”

Gold: Yeah, I think that’s hugely important. And the other thing is we have…so, part of our program, the Certificate Program, is we convene face-to-face to get the learning community together and then we have six hybrid sessions. So, they’re really online sessions where the campus teams meet on their campuses for two hours, but the first hour, my team sponsors. So, we deliver programming to them about the data over a Zoom call, which is like a webcast call. And last week we had one of our professors talk about implicit bias. And we had all of our participants take tests to try to uncover some of their unconscious associations that may negatively affect their impressions of certain groups of people. And so, just having these conversations and becoming aware of them, similar to your comment about including more multicultural examples in the curriculum, these are all things that most faculty just don’t think of. They’re not taught these things; when I went to school, when I was taught, those were largely absent. And so, I commend you for thinking of that and challenging yourself to include those references in your course.

Wyatt: Well, I was provoked by a friend of mine who just asked the question. You’re provoking an entire system by showing them the data. [Laughs]

Gold: That’s the hope. [Laughs]

Wyatt: That’s the hope.

Gold: Yeah.

Wyatt: So, what do you…where do you think this is going to evolve? If the world was perfect, where does this land you in about five or ten years?

Gold: You know, that’s a good question. Less about me, I think…I gave a presentation a couple of months ago to a group of non-profit organization leaders. And, I have to tell you, I was shocked by how interested they were. I didn’t think they were going to be very interested. It’s not their field, it’s data. But…and I have the advantage of being able to show them what we’re doing. But I think the takeaway that I tried to leave with them, the thing that really resonated, is that no matter where you are, and our institutions are very different than Southern Utah, I’m sure, and there’s great variation across the industry as well, no matter what we’re doing in life, whether professionally or personally, most of us have some set of goals or idea of where we want to be. Sometimes that’s looser than others, but certainly with accreditation and in higher ed and in business, we’re often thinking about, “Where do we want to be? How do we want to improve?” And I think the more people you get, across the organization, thinking about that as often as possible in a non-threatening, non-compliance type of way…if that director of undergraduate research on one of our campuses, who gave that great presentation years ago, could get up every morning and think, “How does what I’m doing directly tie to the university’s goals? How am I going to measure it, not on an every-decade basis, not even on an annual basis, but maybe term to term, or even quicker? How am I going to know if I’m hitting the mark and how can I commit to making changes for improvement?” I think that’s where, in five years, I would like to help the CSU become more of a learning and improvement organization. Higher ed is notorious, and we’re no exception, for the slowness of change. They talk about the higher ed pace of change compared to industry, and I’m sure you folks know, it’s stereotypically slow. We’re typically comfortable where we are.

Meredith: Cold peanut butter. Cold peanut butter.

Gold: [Laughs] There you go, yeah. And so, my goal in five years, for us and for more institutions, is to empower the folks who are really on the ground, our faculty and our student affairs people, the people who are running a terrific program, to be thinking about this, not on a regulatory basis, but on a continual, professional basis, and to have the data that they need to understand the impact of what they’re doing and how to improve it. I mean, our students, as they come to us, are going to require more and more what I call “personalized learning at scale” and the way to get there, I think, is by really being artful in the collection, analysis and provision of data that can help people, that can be somewhat turn-key in their process to make changes and then measure again.

Wyatt: Jeff, I think that…I think our conversation has helped us think through the opportunities that great data can offer in helping us improve our outcomes and question our assumptions. Because now, we’ve got a way to verify that we’re accomplishing what we think we’re accomplishing, or hope we’re accomplishing. But even more than just that, for a faculty member who may not have access to this data, just the fact that we’re talking about it, hopefully, will provoke them to think, “I wonder if I’m accomplishing what I want to accomplish, because apparently, that’s not happening uniformly around the country.” And there are some simple ways that faculty could collect data for their own class, in their conversations and maybe small surveys in their own class, and then watch their own numbers. So, the level of sophistication that institutions bring to help faculty become successful needs to continually improve. But maybe our own faculty members can see in this opportunities for them to encourage us to continually improve, and also see that they can find some of this data, although in a less sophisticated way, on their own, probably in their own classes.

Gold: I think that’s absolutely right. One of the things I often get when I show what we have done is, “My God, you guys are way ahead, what can we do?” And I think you just answered that question beautifully, which is, we’ve been thinking about this maybe for longer and maybe have had more resources than most institutions. But you start from somewhere, and you don’t need beautiful graphs and you don’t need a fancy program. To your point, you need to start collecting data. Maybe it’s survey data from your students, maybe it’s just looking at the grade distribution in your own classes. That’s a great place to start. It doesn’t have to be fancy, but there is that commitment to continuous improvement that you mentioned that I think is so critical.
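In the spirit of that “start from somewhere” advice, here is a deliberately simple sketch an individual instructor could adapt to tally their own grade distribution by student group from a class-list spreadsheet. The file name and column names are hypothetical.

# Sketch: an instructor's own grade distribution by group, no dashboard required.
import csv
from collections import Counter

def grade_distribution(path):
    """Count grades per student group from a CSV with columns: student, group, grade."""
    counts = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            counts.setdefault(row["group"], Counter())[row["grade"]] += 1
    return counts

if __name__ == "__main__":
    for group, grades in grade_distribution("my_course_fall.csv").items():
        total = sum(grades.values())
        dfw = sum(grades[g] for g in ("D", "F", "W")) / total
        print(f"{group}: {dict(grades)}  (DFW rate {dfw:.0%})")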

Meredith: You’ve been listening to Solutions for Higher Education, a podcast featuring Scott L Wyatt, the president of Southern Utah University in Cedar City, Utah. We’ve had, joining us by phone from his office in Long Beach, California, Jeff Gold, the Assistant Vice Chancellor for Student Success Strategic Initiatives, and we’ve been discussing the incredible use of data in the CSU System to drive continuous improvement in that system. Thanks to our listeners, thanks to Jeff for joining us, and we’ll be back again soon. Bye bye.