Insights Xchange: Conversations Shaping Academic Research

Lisa Cuevas Shaw on Culture Change, AI in Publishing, and Research Funding

July 12, 2022 | ScienceTalks | Season 2, Episode 2

Nikesh Gosalia and Lisa Cuevas Shaw discuss open science and its major challenges in depth, covering everything from AI tech to research funding. Lisa shares her thoughts on the funder's role in driving change and addresses the unique issues that currently affect research funding. They tackle another very relevant topic for researchers: the use of AI-based tools in research. According to Lisa, the foundation of AI tech lies in the judicious use of tools to make processes easier. She also talks about the Center for Open Science's efforts to drive culture change through intuitive tech tools. Nikesh and Lisa then explore the impact of open science on the scientific community, as well as on society at large. The conversation ends on a light note as Lisa talks about her foray into the winemaking business.

Lisa Cuevas Shaw is the COO and Managing Director of the Center for Open Science and an adjunct professor of Management at Pepperdine University. Lisa has extensive experience in the publishing world, having previously worked as COO and Deputy Publisher at JMIR Publications and as Senior Vice President at Sage Publishing. She can be reached on Twitter.

Insights Xchange is a fortnightly podcast brought to you by Cactus Communications (CACTUS).

Do you think some of the things that you've mentioned are the major challenges that prevent an organization from transitioning to open science? And in your experience, why does it take so long? I mean, is there anything that organizations can do in particular to adopt open science practices sooner?

I think there certainly are challenges, and there are probably too many to list. But just to call them out, and then maybe shift into how you approach this and why it takes organizations so long: there are things that are so entrenched and institutionalized. As I mentioned, the reward system, especially promotion and tenure at universities, is not set up to reward open science. That leaves some researchers or communities feeling disenfranchised from the movement, thinking, why should I go through these seemingly extra hoops if it doesn't benefit me or support my future opportunities to contribute to society through my research? That's a valid issue, right? It's not a selfish one; it's just that the system is not supporting what we value. So, again, there's no blame game there. It just is what it is. There's not widespread adoption of rigorous or reproducible practices across communities, so there's a lack of knowledge of and access to the tools for open science. We are trying to change that at the Center. Also, among the challenges, we have a ways to go to make open data and other outputs actually useful. Again, that takes infrastructure, learning, and committed effort around tool development. We are not just talking about archiving any old output; we have to standardize outputs so that they can actually be reused. These are some of the things that make open science feel like, well, is it really worth it? Are we going to spend a lot of money and not use what we're producing?

The other thing with open science, and where you get a little bit of lag as well, is the perception that the tools created for reproducibility of studies are all oriented to confirmatory science rather than exploratory science. That doesn't help the cause at all. There are many researchers, again, who might feel disconnected or disenfranchised, or at least misunderstood or skeptical about the goals of registering your studies and making your studies more open. So, there are other challenges, but the flaws in the system, if you will, are largely related to a lack of widespread support and alignment in incentive structures to help with greater adoption.

As for why it takes so long and what organizations can do about it: this is all culture change too. And culture change is really hard; it's deeply entrenched. This is a complex, decentralized system with lots of communities, lots of tools, lots of communication channels, so the system is not currently, or not yet, aligned with enough support. Systems theory would say that systems actually prefer homeostasis, right? It's much more comfortable to go back to normal. So, it takes a combination of disruption, norming of expectations and values, and real hardwired support across several facets of the research endeavor to realize change on a broad scale. And I think we are getting there; a lot of things have accelerated in the last few years.
When you look back on those disruptors, and maybe some of your listeners would know Plan S, or cOAlition S, that's a good one to call out: at the time it first came out with real demands, like, this must change, these business models must change, open access, all of that must change almost overnight. They seemed a little unreasonable considering what really needed to happen from a systems perspective across multiple stakeholders, and in some cases they were. But they did actually realize some change along the way by being pushy and disruptive. So, in effect, those types of things contribute, as do collaboration and alignment, and I think that's where we are getting to right now. For organizations, I think it just depends on your goals: really figuring out what's possible in order to sustain your own unique mission, and then working with others, trying not to create redundancies, and joining up and aligning where there's an opportunity to reduce the cost that it takes to transition. I think that's a real concern, and a valid one, across pretty much every stakeholder in the research system.

When you were talking about Plan S, a question popped into my mind. We are increasingly talking more about the importance of the role of the funder. From your perspective, how important is their role going to be, say, in the next 10 years?

Yeah, I think the funder is critical in driving culture change. I don't think there's any way around that one. When it comes to matters of being able to actually do the research, and ultimately matters of compliance, we know that influence can be held by those who hold the proverbial purse strings. And that's just what it is. The wonderful thing is that we've seen tremendous efforts and innovation among a range of funders, many of them trying to cut back on the time and the requirements it takes to submit a proposal and get things funded, so there are lots of interesting dynamics there, like a lottery system so that funding is more equitable. There's a tremendous opportunity for ongoing innovation and influence by funders, and we've had a lot of success and see a lot of movement in the funder space: whole coalitions coming together, adopting open practices, and making open practices a requirement of the funding. So, I think that's going to be pretty critical for the next many years in terms of advancing the open science movement.

Just talking about funding, it's been in the news a lot over the last 12 to 18 months, with all the political tensions, Brexit, and a lot of other priorities for governments: the pandemic, climate change, or just the energy crisis. What is your take on research funding? A lot of conversations are happening about funding possibly being reduced, but ironically, that could be catastrophic considering how important it is.

I hear you. I think there are a lot of unique challenges today, in, I'll say, today's age, just to place it. A pandemic obviously is one that comes to mind. And there are even real choices about national interests versus global interests; there's a limitation of resources, and how do you navigate that? I don't think that's going away; that's a real challenge. As a leader of any government, or of IGOs, NGOs, and so on, you've got a lot of challenges to think about when there are constraints involved.
And then there are just the philosophical and, obviously, political dimensions of things. So, I recognize the diverse, complex, and economic challenges that funders face, government funders specifically but private funders as well. And yes, it would be catastrophic if funding were reduced. At the same time, it's almost all the more reason to invest in open science practices and greater efficiencies, so that we're using those funds in a more effective and efficient way. I think this is where technology improvements and enhancements can help us understand how to make better investments, and that should be a community-organized effort; a lot of stakeholders can actually improve the use of funds. That's not simple either, so I am making it sound easier than it is. But there's promise there to actually make research more efficient and, therefore, in some respects, make the use of funds and the return on investment better.

Lisa, you are very passionate about open science, very optimistic, you are a risk taker. But in your opinion, do you ever see a world with fully open-access science?

I actually think that we could see open science as the default practice, yeah. On open access, I mean, I think we still have to sort through a lot of the models; there are many new models beyond transformative agreements, now more community-driven things. I think there's a lot to work through in terms of inequities and gaps in access and all of that. But I do think that we could see a world where open science practice is the default at a real practical level. Even in that world, there would be plenty of opportunity for researchers and communities to maintain agency over which aspects of their work, and when, should or need to remain private until ready; there are a lot of different use cases where that makes a lot of sense. But yes, I do see a world where science is open as the default. If you reflect on just the last 10 years, in a short time we've seen a pretty profound change in open access. I think the next wave, and we are already there, but where we'll see a bit more broad-scale community norming, is around open data. There are a lot of issues to work through there too: infrastructure, processes, protocols, evaluation, whether it's usable, privacy concerns. All of those things need to be addressed, and there are lots of change agents and leaders and communities working on them.

Absolutely. Moving on to the tech side of things, over the last few years, in my own experience, I've been seeing a greater shift towards folks being more open-minded about talking about technology and adopting AI-based tools. In your experience, do you see the same, and what kind of technology or AI tools do you think are prevalent in this space?

It's an exciting and critical time, I think, for the application of AI and machine learning to scientific publishing and the research process more broadly. When I reflect on even just a few years ago and some of the conversations you'd have at conferences, there was just this buildup of AI, and it almost felt like we didn't know what we needed to apply it to. It was more like, here's the technology, and then, what problem are we solving? I am sure those who were out in front obviously had a vision and continue to have that vision. It's becoming more obvious in terms of the application.

I think at the heart of it, not the only thing but a big foundational component, is solving for efficiencies and improvements across a range of elements in the publishing and research lifecycle. So, you have a technology like DataSeer, which can scan large bodies of articles or text for data availability phrases, trained to understand when there should be something that then should be linked to an output. That lets authors, publishers, funders, many stakeholders know that they could or should make data available at a certain point in time. Doing that manually is a process that takes a lot of human effort. So, that's a basic tool, and it's not really a basic tool, I am sure they've invested a lot in making it run effectively, but it's an example of making something more efficient in terms of looking for open access to data or code or that type of thing.
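[Editor's note: To make the kind of scan Lisa describes concrete, here is a minimal sketch in Python. It is not DataSeer's actual method, which presumably relies on trained models rather than a fixed pattern list; the patterns and the sample text below are purely illustrative assumptions.]

```python
import re

# Illustrative patterns only; a production tool would use trained models,
# not a short fixed list of regular expressions.
DATA_AVAILABILITY_PATTERNS = [
    r"data (?:are|is) available (?:at|from|upon)",
    r"data availability statement",
    r"deposited (?:in|at) (?:the )?\S+ repositor(?:y|ies)",
    r"code (?:is|are) available",
]

def find_availability_statements(text: str) -> list[str]:
    """Return sentences that look like data/code availability statements."""
    # Naive sentence split; good enough for a sketch.
    sentences = re.split(r"(?<=[.!?])\s+", text)
    return [
        s.strip()
        for s in sentences
        if any(re.search(p, s, re.IGNORECASE) for p in DATA_AVAILABILITY_PATTERNS)
    ]

# Hypothetical closing section of an article.
sample = (
    "We thank the reviewers for their comments. "
    "All sequencing data are available at the NCBI repository. "
    "Analysis code is available upon request."
)

for hit in find_availability_statements(sample):
    print("Possible availability statement:", hit)
```

[A pipeline built this way could flag manuscripts where no such statement is found, which is exactly the manual checking step Lisa notes the tooling removes.]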
Peer review too: AI applied to building more engaged peer review communities, since finding peer reviewers is a problem for publishers, and I'm not saying anything new there, but also AI applied to some basic checks, either from an editing perspective or on the quality of the paper. There are a lot of frameworks, as you know, specific to certain types of studies or communities, that are being applied with AI and machine learning scanning papers, again, removing some of the human checking just to move things along the pipeline. I think that's interesting.

One of the areas that's really interesting for us at the Center is that we are trying to advocate for and support the research process being a self-critical and self-improving endeavor. So, what we are also seeing, and I think many of your listeners are seeing, is AI for the assessment of research, including assessment of the credibility of the findings themselves. This is an exciting one that has huge implications for greater access among all people to actually translate, understand, and apply research. But it also has a lot of implications in the other direction: AI in general, if left unwatched and untamed, could perpetuate biases that already exist based on certain criteria that we look for. Citations, we know, are already biased based on certain demographic and other considerations, so we have to be really careful that the tools we are applying do more good than harm, and then continuously improve them and understand what we are actually getting from the outputs. There are many other applications and developments, I think. It's a really exciting time. But I also think there's still the need for the human element and human judgment and oversight, obviously.

By the way, I think DataSeer, if I am not mistaken, got the maximum number of votes at SSP as one of the products to watch out for.

Oh, okay.

Yeah, that was really interesting. And I know that the Center has been working on, or offers, a lot of open source technology. What kind of innovative technology is the Center working on, anything that you can talk about?

I mean, a lot of our tech, I would say, and I wish I had our product office and our tech team here to talk about it, but I think a lot of our technology is focused on some of the stuff that's maybe not as exciting but so critical, which is user interface and user experience, making things easy and intuitive. Those are still foundational.
Especially when you talk about culture change, it's not always the exciting stuff; it's the stuff that makes the interface elegant and useful and simple that's actually going to gain traction and advance your mission. So: user interfaces that are intuitive and useful. Interoperability too. We are big on saying, look, we are not about the Open Science Framework alone; we are about contributing to and supporting open practice across the research lifecycle, and we want to integrate with the tools and services that researchers need. That's a lot of our focus. I touched a little on research assessment tools, and I think that's an area of opportunity for us, but we are in the early stages, even from a research perspective. We want to make sure that we are good stewards of any AI technology or algorithm development, and that it ultimately is community owned. We are not going to go off and do this on our own; rather, the way to ensure trust in those algorithms is to involve many people across different communities. But we are looking at research assessment tools to further the research community's ability to be self-critical throughout the research lifecycle. The natural entry point for us to apply these assessments might be at the preprint stage. OSF works from project ideation all the way through to OSF Preprints, so, dissemination, and you could easily see us applying research assessments before or right at the preprint stage. Those are some of the things we are keen to explore at this point, yeah.
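[Editor's note: For readers who want to see that interoperability in practice, OSF exposes a public REST API at api.osf.io. The sketch below, in Python, lists a few public preprints; the pagination parameter and the attribute names ("title", "date_published") are assumptions about the JSON:API response shape, so treat it as a starting point rather than a reference.]

```python
import json
import urllib.request

# Public OSF v2 API; responses follow the JSON:API convention of a
# top-level "data" list with per-item "attributes".
URL = "https://api.osf.io/v2/preprints/?page%5Bsize%5D=5"  # page[size]=5, URL-encoded

with urllib.request.urlopen(URL) as resp:
    payload = json.load(resp)

for preprint in payload.get("data", []):
    attrs = preprint.get("attributes", {})
    # Field names here are assumptions; .get() keeps the sketch from
    # crashing if the API exposes different attribute names.
    print(preprint.get("id"), "-", attrs.get("title"), "-", attrs.get("date_published"))
```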
Finally, I want to move on to the impact side of things. Over the last few years, again, we've been hearing a lot more about impact, dissemination, engagement. What kind of impact do you think open science has on the community at large?

If we're talking about, and I'll start with, the scholarly communications community at large, I think the impact could be profound insofar as it could support even greater trust, and also a greater demand for the role of vetting, disseminating, translating, and applying rigorous research to solve problems and serve people. There's still a lot that needs to happen in translating outputs, and the scholarly communications community can take a greater role, depending on the focus or core competencies of the organizations. If we are talking about society at large, I think open science has the potential to be the antidote to mistrust: greater transparency. Open it up, and let's make sure we're all educated on, and learning, how to critically vet what is open and available to us. So, there's an opportunity to combat mistrust. And again, disinformation and misinformation are big things that aren't going to go away. For society, open science has the potential to drive greater quality, efficiency, and equity in the scientific endeavor, which should result in faster cures, faster solutions. That piece of it would be my 30-second pitch for open science: by making it open by default, we can improve rigor and reproducibility and accelerate the discovery of solutions to complex global problems. That's the hope.

In the recent past there's been some contentiousness between the science community and the general public. What do you think the scientific community could do to improve their relationship with the public?

I guess, yeah, the contentiousness, or maybe mistrust or feeling kind of put off in some ways, is understandable for many reasons. In the science community there is certainly a power dimension. And maybe power is not the right word, but a kind of elitism that doesn't help in building trusting relationships with the public at large. Scientists, and I am really overgeneralizing here, but at large, we in the scientific community speak science better than we translate it. I think that's changing and improving. But more importantly, the science community and those who leverage its outputs, so media, journalists, public agencies, we all have an opportunity to do a better job of explaining science, what it is and what it isn't. Science is a way toward the truth. It isn't intended to be the definitive truth all the time, and it can't be when certain questions are asked. It's a self-corrective and self-critical process of discovery and improvement. We have confirmation some of the time, but not all the time. And I think a lot of things get misconstrued and misused as definitive. Then, when we have to backtrack and say that we learned something new, everyone goes, what are you doing, right? And you have a big responsibility. So, it's a complex process. I think we can do a better job of not knowingly or unwittingly wielding the power of science in ways that do the process a disservice in the long run. We have to acknowledge what it is and what it isn't, and do a better job of explaining it along the way.

Thank you so much for that, Lisa. One question that I always ask: how do you keep yourself updated with industry knowledge, and with larger trends around things like research funding, the role of governments, and the scholarly community?

It's a tough one, right? Actually, we talk about this as a leadership and management team: we've got no lack of information and insights coming in; it's how do we carve out the time to make sense of it together. I don't know if we've solved it yet, but probably like you and like many of your listeners, I sign up for lots of things, so I look for all the organizations that are putting out newsletters or updates or that type of thing. I get those all coming into email, and I try to carve out some time to actually look at things thoughtfully and read them. I do take at least an hour a day. Where I learn the fastest, though, is almost like a journal club, where you literally take an article or a piece of news and have a conversation about it with colleagues; that's usually the most beneficial in terms of accelerating the sense-making process. So, I encourage the ongoing convening; the social learning opportunities are so important in addition to simply having access to everything. It's overwhelming sometimes, right? You want to feel like you can keep up with things, and it's a lot. So, again, if you have any tips on that one, that would be great. But I think that's going to continue: there is no lack of information and data and insights coming toward us. So, just sift through it, and try not to get overwhelmed. If nothing else, look at the sources that seem to resonate the most with you or will provide the best inputs and insights for what you need to accomplish.
But do try to carve out that time to go a little deeper, unplug, and try not to apply it directly to what you need to do, but to think big sometimes as well.

I think that's a brilliant insight, Lisa. I agree with you; there's no shortage of information. I guess all of us are guilty from time to time of saying, let me subscribe to all of these channels, I am going to spend a block of time reading up, and then it doesn't always happen, considering how busy our lives are. But like you said, maybe a disciplined approach of spending an hour, sifting through the most important things that are relevant, and doing it on a regular basis. That way we can build cumulative knowledge over a period of time, which perhaps we are also surprised by, rather than doing it in one block of time. So, thank you for that.

Yeah, the most effective people I've seen are really active, but they are pretty quick to share out bits with their team. And I think that process of sharing requires you to process things a little, to think about their relevance, contextualize it, and then maybe get some reactions if someone has time. But there are probably other mechanisms to do that even better.

So, last question, so we can end on something light.

Yeah, we are deep, we are trying to solve the world's problems. This was tiring.

I've clearly heard about your love for wines, and I just want to know, how did that happen? I believe you have a line of wine with your husband. So how did you get started? Is it because you were in California, or did California happen because of wine, or is there another story to it?

I was born and raised in California, in Hollywood actually, but we didn't move to California for the grapes and the resultant wine. What happened is that my family purchased a property in wine country known as Paso Robles, in the central coast area of California, smack dab between Los Angeles and San Francisco, and kind of sat on it. This was in the late 80s, and in that region there was a lot of planting of vines; it was really picking up. We sat on the property for a while, didn't do anything with it, and then decided around the year 2000 to plant many, many acres of Cabernet Sauvignon. A few years after the initial plantings, they put a house on the property. And John, my husband, and I would drive up, and we just fell in love with the community, met so many winemakers. So my family was just selling the fruit; they were too smart to get into the wine business. But not us. We loved wine, and we thought, wouldn't it be great if we could produce some wine from the family fruit and see what it does? And so we did. About 13 years later, we are still doing it under our own label. It's very small; it's just the two of us, and we've got a little network of friends, some winemakers who have been wonderful mentors along the way. It's a veritable business, but a small one, so we can contain it and be very active. But we always joke that the real impetus was that in the publishing industry, as you know, all good ideas, new products, new services typically happen over a meal, and typically that meal has some wine. So, we married the two activities pretty well. And that's pretty much what happened.

That's brilliant. I love wines, but I am a relative newcomer in that area. So, any tips for beginners like me?
Yes, just taste, just go out there and taste. We try to approach this without any snobbery; it's a learning experience. I remember when we were first starting out with wine drinking, and what we drink today versus where we started is very different. Certain things come in waves in terms of your tastes and what you are looking for. But just go out and actually be with people who enjoy learning about the process and what's going on. It's kind of a memory game too; you train your palate to understand what it is you are tasting. And then some of it just depends on what foods you like and other things; it does matter whether you tend to like certain foods over others. But yeah, I say just trial and test and taste. That is my best advice to you.

Thank you so much, Lisa. I guess on that note, it's a wrap. But thank you so much, once again, for your time. This was really interesting.

Thanks so much, Nikesh. I really enjoyed it, and I appreciate what you are doing. It's good fun, and I enjoyed it so much.

Thank you, everyone, for joining us. You can subscribe to this podcast on all major podcast platforms. Stay tuned for our next episode.