Significant Impact: from K Award to Your First Big R01

Tempted to Use AI to Write Your Grant? Here's What To Consider

Sarah Dobson Episode 81

The ever-expanding presence of artificial intelligence in our professional lives has reached the sacred territory of research grant writing, raising profound questions about scientific integrity and the nature of academic work. As someone who's spent a fair amount of time exploring these tools firsthand, I'm sharing my perspective on how researchers can navigate the use of these tools without compromising what makes their work valuable.

Grant writing isn't just an administrative task standing between you and your research—it's a fundamental component of the scientific process where ideas crystallize and projects take shape. When we frame grant writing this way, the appeal of AI shortcuts gets more complicated. 

What exactly are you saving yourself time from when you outsource this critical thinking work? The pressure to produce more grants in less time stems from structural problems in academia that technology alone cannot solve. 

Most researchers face overwhelming institutional expectations to secure funding while simultaneously teaching, providing service, and for many, delivering clinical care. The solution involves setting boundaries and clarifying priorities rather than finding technological workarounds. Even practically speaking, today's AI tools remain unreliable for academic writing—producing fabricated citations and requiring extensive verification that may actually increase your workload. 

What makes your research valuable is your unique perspective, critical analysis, and deep understanding of your field—the elements that cannot and should not be delegated to a large language model. AI may be here to stay, but let's ensure that it enhances rather than diminishes the human intelligence at the heart of scientific advancement.

For more content on grant writing techniques, visit my YouTube channel where I share technical advice on NIH grant writing (or just google "Sarah Dobson NIH grant writing tips" and that should get you where you want to go).

Speaker 1:

This episode is a bit of a departure from what we typically talk about on the podcast, but I think it's a really important topic to bring into our conversations about the K to R transition and about designing your professional life. So this episode is taken from a recent YouTube video that I posted on my channel and, yes, I do have a YouTube channel, if you're not aware. The YouTube channel is, I would say, much more tips and tricks and tools for grant writing, specifically NIH grant writing. So it's a lot more, let's say, technical than what I talk about on the podcast here, which I would say is more career oriented and mindset oriented. So if you're interested in looking at some of the more technical and sort of nuts-and-bolts, how-do-you-write-a-grant type content, you can definitely find that over on my YouTube channel, and I will put a link to that in the show notes so that you can find it. But really you could just Google "Sarah Dobson NIH grant writing tips," and I'm pretty sure that will work. So, yes, I have a YouTube channel, and I have recently done a series on using AI in research grant writing. I have, I think, fairly conservative views on using AI in grant writing, and that comes after I've been playing around with these AI tools to see what they're capable of, how I can use them and the amount of training that I need to use them.

Speaker 1:

Well, and so I did a series because, to me, it's unavoidable now. It's unavoidable that these tools are around us. It seems like they're being forced upon us in a lot of ways. You can't get away from them: on search engines and in virtually every app that you're using, there's some sort of AI assistant. And so it felt long past time to talk about this in the context of research grant writing, not only because I've been getting a lot of questions about it from students and clients, but also inquiries from people who aren't really in my community just wanting to know. And the reason that I have delayed doing this series for so long is because I wanted to make sense of it for myself first and really see what these tools can do and how they can be used responsibly and ethically. And so I wanted to share with you the first video that I did in this YouTube series, where I talked about using these tools ethically and responsibly, and why I think it's important to bring it into the podcast, where we talk about the K to R transition, where we talk about designing your career, and where we talk about how to prioritize your own work and your own impact.

Speaker 1:

The reason that I wanted to bring this AI conversation here is because my sense is that what makes these tools really appealing is using them to speed up the grant writing process and using them to get around a problem that has nothing to do with grant writing. It has everything to do with how you organize your time and your schedule and your commitments and your responsibilities, and that is something that I talk about in this video and, of course, something that is highly relevant to what we talk about here on the podcast. Right, this is something we've talked about over and over: the expectations that are placed on you by your institution and the requirement to churn grants out with no regard to quality, only quantity. AI can't solve that problem. And if you look to AI to solve that problem, you are just handing over all of your original thinking, your original ideas, your ability to think critically and your ability to engage with the peer-reviewed literature, and I think that is extremely dangerous. And so, yeah, I may have very conservative views about AI in grant writing, but I'm not going to apologize for it. Certainly at the moment, with where the tools are now, and obviously we are only at the beginning and these tools are only going to improve, but given where we're at now, in my view these tools are appealing to researchers because they promise to solve a problem that cannot actually be solved by an AI tool, and so I just want us all to be very careful about that. So that is a long enough intro to this episode. I will hand it over to the YouTube version of myself so that you can hear the rest of the episode. But if you found this interesting, if you are curious about this, I encourage you to view the whole series over on YouTube. And if you want me to talk more about this on the podcast, just send me an email. Let me know what you think. I would love to hear from you. All right, let's get into the episode.

Speaker 1:

Welcome to the Significant Impact Podcast, the show dedicated to helping women faculty convert their NIH Career Development Award into their first big R01. This period in your career is such an important turning point, and it's a crucial opportunity to design the kind of research career that really works for you, so that you're able to write and lead these big, career-fueling research project grants. It's not easy to figure out what you really want when you have so many different voices in your ear telling you what to do and how to do it, but it is possible to design a career that's fulfilling and meaningful to you while also securing enough grant funding to sustain your lab and make an impact with your research. That's what we're talking about here on Significant Impact, with me, Sarah Dobson, NIH grant consultant and academic career coach. Tune in for an honest look at what it really takes to be successful in the world of NIH grant funding, and start thinking differently about what an academic career looks like: one that's driven by purpose and curiosity and a healthy dose of disruptive energy.

Speaker 1:

AI tools have been integrated into so many of our daily activities, professional and personal, that it can seem almost impossible to avoid them at this point, and so that's why I wanted to talk about the ethical and responsible use of AI in research grant writing. I have been planning to do a series on AI in research grant writing for at least a year now, which has given me time to observe and use some of the tools myself to see what the challenges, pitfalls and opportunities are with these tools and where we need to be really careful, especially when it comes to grant writing in research. So before we dive into some of those considerations, I first want to talk about my philosophy for grant writing and how it informs my philosophy on AI in grant writing, because I think that's really important to set the stage for the conversation and, obviously, for you to decide whether you agree with me or not on this, right?

Speaker 1:

I think it's fair to say that I am cautious and conservative about the use of these tools in research grant writing, for very good reason, but let me explain why. If you have watched any of the videos on this channel, or if you have subscribed to my newsletter (and if you haven't subscribed to my newsletter, I highly recommend that you do, because that's where all of my best stuff is), you will know that my core philosophy when it comes to grant writing is that it's your job to communicate the value of your research to your reviewer. And alongside that is the idea that your reviewer is a busy, stressed-out, tired human who is volunteering to review a bunch of grants for scientific and technical merit, and your job as an applicant is to make it as easy as possible for your reviewer to see the value in the research that you're doing. And when we think about AI in that context, I mean, unless and until AI becomes part of the scientific review process, that is a different conversation, that is a different story, and you can just ignore everything that I've said here. But right now, scientific merit review still happens through a peer review process, and it is my opinion, and also my experience, that the most compelling applications, the most successful applications, are the ones that generate enthusiasm among your reviewers. And that is a blend of having a really strong research idea, but also how you communicate the value and importance of that research idea. So that's the first core philosophy that's underpinning how I'm seeing the use of AI tools in research grant writing.

Speaker 1:

The other philosophy that I have that's really important to this conversation is the idea that writing a research grant is very much part of the research process. It is not an administrative hurdle that you have to overcome in order to be able to do your research. Writing a grant is what happens at the early stages, where you are sharpening your thinking, where you are planning and designing your project, where you are anticipating pitfalls and ensuring that you have alternative strategies, right? It's all of the work that needs to be done to set yourself up for success once the project is underway. And so assuming that it's just a bunch of checkboxes that you need to hand over to a funder to be able to get that money to do your research is the wrong way to think about it, in my opinion. You can disagree with me if you want to, but then, you know, why are you here, right?

Speaker 1:

So this is how I see grant writing: it is a core piece of the work that you do as a scientist, especially when you consider those early planning stages and the deep thinking that needs to be done to do it well. And of course there's the communication aspect, which also comes into play when you are preparing publications for peer-reviewed journals. So the early practice you get when you are preparing a grant application can serve you really well when you're writing up your findings to submit to a journal, right? So, for all of those reasons, grant writing is a core part of the work that you do as a scientist and should be treated as such. And so, with those two foundational pieces in mind, let's talk a little bit about how they come into play when we think about using AI tools in grant writing.

Speaker 1:

And I know that it can be really tempting to use these tools because you think it's going to save you time, but I want to challenge that assumption in two important ways. Number one: save you time from what? I just explained my philosophy about grant writing being part of the research process, right? And so, if we think about grant writing in that way, what are you actually saving time from if you are outsourcing this work to an AI tool? And it's important to consider that, because most of the academics I know, in fact all of the academics I know, are under tremendous pressure to fulfill the expectations that their institutions have for them around research and teaching and service and, you know, clinical work, if that applies, right?

Speaker 1:

And so, again, it can be really, really tempting to find shortcuts to be able to do more work in less time. But what is the problem that you're actually solving here if you're outsourcing some of the core work that you need to be doing as a researcher? In my view, outsourcing that core work is solving the wrong problem. The problem is not even a time management problem, it's a structural problem. It's those expectations that your institution has placed on you around securing grant funding while also doing all of these other things. And so solving that problem requires something different. It requires clear priorities, a clear understanding of what your value and impact is, and it requires some boundary setting. And again, I'm sure it does seem easier to just use an AI tool as a shortcut to avoid doing that more difficult and even emotional work. But if you outsource your thinking to an AI tool, what is left for you as a researcher? So I want you to consider that very carefully and challenge the assumption that you have around these tools saving you time, because what is it actually saving you from?

Speaker 1:

The second challenge that I want to suggest, when it comes to this idea of an AI tool saving you time in the grant writing process, is: is it actually going to save you time? You know, at the time of this recording, we are still very much in the Stone Age of AI tools, and who knows how long that will last. The developments and improvements are happening very, very quickly, but at the moment, these tools are still pretty unreliable when it comes to summarizing and interpreting and even producing citations. Very famously, that happened recently with an HHS report that contained hallucinated citations. So that is still very, very common in the output from most of the AI tools that are available at the moment.

Speaker 1:

And so if you are planning to use one of these tools to assist you in the grant writing process, to do that responsibly, to do it ethically, you are going to have to double and triple check all of those outputs. And my question to you is: is that actually going to save you time compared to writing it out yourself? So, again, I think that this idea that these tools can save us time needs to be challenged before we can think about using them ethically and responsibly. So I will leave you with that, and then let's talk quickly about some of the considerations around using these tools ethically and responsibly. We're just going to do this very briefly, at a very high level, because we will get into these in more depth throughout the rest of this series.

Speaker 1:

So number one, and this should go without saying, but I'm going to say it anyway, is that you need to make sure that you are in compliance with your institution's guidelines and the funder's guidelines around the use of AI. So, for example, at NIH right now, there are guidelines around the use of AI in the peer review process, but not for the development of a research grant. But there are lots of guidelines and regulations about original research, and your institution might have its own rules and guidelines around the use of AI in grant writing. So do not pass go until you are clear about what those guidelines are and what they allow you to accomplish with an AI tool. So that's number one.

Speaker 1:

Number two is that, if we think of writing more broadly than just generating text, AI can be used ethically and responsibly in some aspects. So, for example, and I think one of the most useful examples here, is developing timelines and organizational structure for actually writing your grant. Remember that writing a research grant is a big project in and of itself. The actual research project, of course, requires project management, but so does the actual writing of the grant, and so you can use a tool like ChatGPT to help you develop a project management plan for your grant writing that considers your existing schedule, commitments and so on. So you can create some efficiency and accountability by using an AI tool for some of those peripheral activities beyond the writing output itself, activities that can actually make your writing more efficient but that don't compromise the integrity or the proprietary nature of any of the writing that you're actually doing. So I think that's a really good example. Another one that I've used in previous videos is using an AI tool to help you identify funding opportunities that you may not have considered, whether that's at the local, state or national level. Especially if you are looking beyond federal funding sources, you might be able to use an AI tool to help you identify some alternative or new sources of funding that you haven't considered before. So, again, if we think about writing more broadly, if we think about grant writing and some of the other activities that are involved in grant writing, AI tools can be useful without compromising the integrity of the actual writing output or the input that you need to provide to get anything useful from these AI tools.

Speaker 1:

Number three, and this is in some ways reiterating what we talked about in number one, is really protecting confidentiality and intellectual property. Remember that most of these tools require that you provide inputs for them to use, and those inputs, in the case of most research grants, would involve original, proprietary work, and these corporations may retain that original and proprietary work and use it to train their models. So you've just handed over your original work to a corporation that is using it to train its models. Again, use at your own risk, or don't, right? Just be really careful in thinking about how you're using these tools and whether that is, first of all, permitted by your institution and whether that is something that you want to be doing. There are, of course, ways within a lot of these tools to prevent them from using that material to train the AI tool. However, the rules and the guidelines and the legality of that are still a bit nebulous, because we don't know how well that information is actually protected from being used for training.

Speaker 1:

And so the last thing is to use AI for refinement and clarity rather than to solve the blank page problem. In a lot of industries, AI is used to solve that blank page problem: you have a blank page, you're not sure where to start, and you can use AI to help you kind of get started with a writing project. But in my view, that is the absolute wrong way to use AI for research grant writing, for all the reasons that we've talked about. You want to retain that original deep thinking as the work that only you can do, and so you would use AI more on the refinement and clarity side of things, but again, very judiciously, very carefully and in a really piecemeal way.

Speaker 1:

And if you still have that blank page problem, that can be solved in other ways. For example, in my Grant Funding Formula program we have templates that students can use that are really kind of fill-in-the-blank, to help you get started on clarifying the description of your research idea and the objectives and aims that you have, right? So using prompts to help you start thinking in that way doesn't require an AI tool. There are lots of resources available to you that can help you prompt yourself, interrogate yourself, to get that information out of you in ways that, again, don't compromise your ability to think, which is one of your core functions as a researcher. So all of those elements need to be considered to use AI ethically and responsibly going forward.

Speaker 1:

And again, this is the beginning of the conversation, not the end. We will continue this conversation in subsequent videos in this series, and we will revisit it again and again over time to refine our thinking on how to use AI in research grant writing in an ethical and responsible way. Thanks for listening to this episode of Significant Impact: from K Award to Your First Big R01. If you want to dig deeper into what we learned today and move a significant step closer to a smooth K to R transition, visit sarahdobsonco slash pod and check out all the free stuff we have to help you do just that. Don't forget to subscribe to the show to make sure you hear new episodes as soon as they're released, and if today's episode made you think of a colleague or a friend, please tell them about it. Tune in next time, and thanks again for listening.