Thought leader interview features discussion with Phil Hill of 'eLiterate'
Inside Higher Ed | December 6, 2017 - In first of a series of discussions with digital learning leaders, the eLiterate analyst explores the role of the faculty, measuring effectiveness and whether tech companies are getting more realistic in their promises.
From the perch he shares with Michael Feldstein -- as analysts, consultants and bloggers, among other things -- Phil Hill of eLiterate and Mindwires Consulting has a cross-cutting view into the worlds of online learning and instructional technology.
Tech companies and colleges alike seek his counsel and (sometimes) bare their secrets and their fears, and Hill and Feldstein -- through their consulting advice and their analytical writing -- help college officials understand the tech landscape, and vice versa.
Sitting at that intersection gives Hill a distinctive set of insights about the state of the ed-tech landscape, some of which he shared in an interview with "Inside Digital Learning." The conversation is one of a series conducted by Inside Higher Ed's Doug Lederman at the Online Learning Consortium's Accelerate conference in Orlando, Fla., last month.
The interviews were sponsored by OLC and Inside Higher Ed's "Inside Digital Learning" newsletter, and conducted on the Shindig video platform.
A partial, edited transcript of the conversation with Hill appears below.
Q: I'm curious about some of your takeaways from yesterday's Leadership Network conversation.
A: Part of the reason that I'm here this week, and part of the reason that we really enjoyed the session yesterday, is we're getting better context in these discussions. It's not about the technology. It's really about how we can improve education, enabled by technology. And so that's what struck me. Yesterday, a lot of conversation came down to faculty resistance, faculty adoption, which are important subjects. But the broader thing is getting to the point that none of this is going to matter if it doesn't actually get used, or you don't actually have people paying attention to what works and focusing on the teaching more than practices …
We don't have as much hype this week. So maybe it might be harder to write great headlines on what's happening with the conference … I think that the context and the discussion that went on here is … getting to the core issue of improving teaching and learning, improving learning outcomes and what is the proper role, what's the improper role for technology …
So you can have a particular technology applied the wrong way or applied the right way. Take support, for example. Almost anything, if you don't have support and professional development available for faculty, and time for them to reflect and learn on what's happening, that technology can fail. And it's not really the technology -- it's the usage of it. So how do you get to that differentiation between the technology and what's effective? It's on the outcomes.
Even if you have something that's designed beautifully as a technology, and even if in a white paper, in theory, the learning practice is superior for students, that's still not the same thing as: can it naturally be adopted? So yesterday we had a lot of intriguing discussions about faculty time. Do they have time to actually learn this? Can they prioritize and figure out how to rethink what they're doing or how to improve what they're doing?
Q: So that's tricky in an enterprise -- higher education -- that hasn't always been particularly good at measuring effectiveness. Partly because, I believe, it hasn't been asked. There's been -- we're sort of in an era of evidence that is not brand-new, but pretty recent. And I don't think this enterprise has fully gotten on board with that yet. But there ... are challenges to measuring, especially if you're talking about quantifiable measures. And one of the interesting little threads from yesterday was that there are a lot of comparisons of online and technology-enabled instruction versus face-to-face and the comparable outcomes.
We've never been particularly good at measuring the outcomes of face-to-face. So … how do you deal with an enterprise that is adjusting, and talk about cultural change, do you believe that we're in an era of evidence and that, going forward, efficacy is going to matter more?
A: I like to look at this type of change by trying to say, erase the time and just take a step back and look at geological time. If you look at it from a long-term perspective, looking at decades, my answer is yes. I gave an example where some of the consulting I did 10 years ago, it would be based on educational technology and LMS adoption. And you would go into schools … and we were explicitly told … don't talk pedagogy. You're not allowed to even mention that when we select an LMS.
Fast-forward to today, people explicitly acknowledge the fact that, hey, this isn't working unless it improves or enables improved pedagogy. So we're now having that conversation. So from that standpoint, we've made a lot of progress in 10 years -- that you're at least allowed to say, this is what counts.
Now, you ask the question about evidence. In my view, in the same way that we were talking about pedagogy 10 years ago, we're just now being allowed to talk about [evidence]. That's not the same as saying we have a base of evidence about technology. It's good that we're at least acknowledging it.
Q: We often … see this bright, shiny object where a piece of technology comes up and everybody’s talking about it, as opposed to institutions identifying issues that they're trying to work on or improve on and finding whatever tools they can, technological or otherwise. Do you think institutions are using a sort of problem-based approach, or are they adopting technologies … in search of a solution?
A: You have the problem of a dean or a provost or a president taking an airplane ride, reading a magazine article about a MOOC, and that changes everything. Or even worse, a state legislator, and you show up at a board or cabinet meeting, and it's, "We're getting into this." So there's absolutely a problem of simplistic technological solutions. It's hard to deny that.
Now, at the same time, if you look at the systemic changes, I guess I would argue that it is getting closer to a problem-based approach. That … the system or the nature of change is forcing people to deal with real problems that need to get solved. So it's almost like the technology and the hype leads it and triggers a change. But then underneath it, over time, the systemic problems are identified and start to sort themselves out in a frustratingly slow process. But I think it's actually happening.
Q: Where's your head space these days when you're at a conference like this? What are you looking to see?
A: Well, the biggest head space is the issue we were talking about, academic cultural change. Because for better or worse, that's what's going to drive the real effects of any of this digital learning type of impact. That's our biggest focus, if you will.
We also have, and this might be due to our personality flaws, a particular penchant for saying, hey, be accurate with what you claim. So we have a particular way of saying, if companies are marketing in a misleading way or even schools are presenting research in a misleading way, we fall into a role of sort of calling that out publicly.
So honest description of what's working and what's not working is important.
Q: This relates to one of the findings of our faculty survey that we didn't get to call out yesterday. The technology enterprise -- technology companies and advocates for technology -- clearly do themselves a disservice when they either overpromise or mislead, because that builds faculty skepticism, understandably. I have sensed the beginnings of some modesty, or less inclination on the part of the technology companies to pretend they have all the answers. Do you see that as well?
A: Yeah, I think the market is maturing a bit. I think the worst of that comes if you're a company and your primary customer base is actually investors or foundations funding you, as opposed to end users who really need your product -- that's where you get the worst of the misses. And we called out some of those, including Knewton, which I think is a primary case of that.
Now, I'll also say about Knewton, they've made a lot of changes … They're much more realistic in what they're claiming at this stage. They're no longer promising the robot tutors in the sky.
So, yes … companies are becoming more mature in their claims. Part of that's driven by schools forcing them, saying, hey, we want something that we could back up; we don't believe you. But I think it's also partly a function of not having the same frenzy of investment that you used to. So you don't have that push from the investor community.
So long and short, yes, I do believe companies are much more mature and, you know, controlled in how they do their marketing than they were two to three years ago.
SOURCE: Inside Higher Ed