The generative AI technology powering ChatGPT—OpenAI's GPT-3.5 and GPT-4 LLMs—is available to R users, with a growing collection of packages and apps to choose from.
ChatGPT can answer questions about a wide range of technology subjects, including how to write R code. That means ChatGPT’s power is available to every R programmer, even those who know little about large language models.
An ecosystem is forming around ChatGPT and R, making it easy to incorporate AI technology into your R language workflow. But before you begin using LLMs and related tools for your R projects, there are a few important things to keep in mind:
- Everything you ask using these tools gets sent to OpenAI’s servers. Don’t use ChatGPT tools to process sensitive information.
- ChatGPT may confidently return incorrect answers. Even incorrect responses can be a starting point and save you time, but don't assume the code will do exactly what you expect. Kyle Walker (an associate professor at Texas Christian University and author of the popular tidycensus R package) tweeted that ChatGPT can “supercharge your work if you understand a topic well,” or it can leave you “exposed for not knowing what you are doing.” The difference is in knowing when the AI output isn't right. Always check ChatGPT's responses.
- ChatGPT can also generate different responses to the same query, and some answers might be accurate while others aren't. For instance, when I asked multiple times for a ggplot2 bar chart with blue bars, the code generated a graph with blue bars sometimes but not others, even though I submitted the same request each time. This is less than ideal if you need a reproducible workflow.
- LLMs have training data cutoff dates, so if there's been a recent update to a package you're using, your tool of choice may not know about it.
- Most of the resources in this article require you to have your own OpenAI API key, and the API isn’t free to use. While pricing is low at the moment, especially for GPT-3.5, there’s no guarantee it will stay that way.
- Asking ChatGPT for coding help is unlikely to drag you into the thorniest debates over AI racial and gender bias. However, there are heated discussions about the wisdom of furnishing OpenAI with yet more data; the ethics of how the training data was scraped and repurposed; and whether it's better to use open source LLMs (such as Mistral's) rather than OpenAI's. Those dilemmas are for every individual and organization to work through for themselves. As of this writing, though, there simply aren't R-specific LLM tools comparable to those building up around ChatGPT.
Now, let’s look at some of the most notable R-focused resources for using OpenAI’s GPT-3.5 and GPT-4 LLMs.
8 ChatGPT packages for R
- The air package
- TheOpenAIR package
- RTutor
- CodeLingo
- askgpt
- gptstudio
- gpttools
- gptchatteR
The air package
The air R package aims to create GitHub Copilot chat-like functionality for R users with two functions: howto() and whatis(). You can ask questions about how to write R code for a specific task using howto(), such as howto("How do I rotate x-axis text on a ggplot2 graph?"). Answers show up in your R console instead of within your R script, as happens with GitHub Copilot. That might be a bit less convenient. On the other hand, I found the responses quite good when using GPT-4 as my underlying model.
For example, when asking howto("Create a bar chart in R where the bars are colored steel blue and the bars are ordered by descending Y value"), air responded with:
# Generate example data
set.seed(42)
data <- data.frame(
  Category = factor(paste("Category", 1:10)),
  Value = sample(1:100, 10)
)

# Order data by Value in descending order
data$Category <- factor(data$Category, levels = data$Category[order(data$Value, decreasing = TRUE)])

# Load ggplot2 package
library(ggplot2)

# Create bar chart
ggplot(data, aes(x = Category, y = Value, fill = "steelblue")) +
  geom_bar(stat = "identity") +
  scale_fill_identity() +
  theme_minimal()
GitHub Copilot gave me:

# Create a bar chart in R where the bars are colored still blue and the bars are ordered by descending Y value
ggplot(data = diamonds, aes(x = cut, fill = cut)) +
  geom_bar() +
  scale_fill_manual(values = "steelblue") +
  coord_flip() +
  theme_minimal() +
  theme(legend.position = "none") +
  labs(x = "Cut", y = "Count", title = "Bar Chart of Diamond Cut")
The GitHub Copilot code did not work (scale_fill_manual() is looking for one color for each category). GitHub Copilot uses an OpenAI Codex model for its responses. Copilot also offers unlimited use for a monthly fee, as does ChatGPT with the GPT-4 model; but using the OpenAI API within an application like this will trigger a charge for each query. Running three or four queries cost me less than a penny, but heavy users should keep the potential charges in mind.
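One possible fix for the Copilot snippet (my own correction, not output from either tool) is to set the fill color directly instead of mapping cut to a one-color manual scale:

library(ggplot2)

# Color every bar steel blue directly rather than mapping cut to a
# single-color manual scale, which errors out
ggplot(data = diamonds, aes(x = cut)) +
  geom_bar(fill = "steelblue") +
  coord_flip() +
  theme_minimal() +
  labs(x = "Cut", y = "Count", title = "Bar Chart of Diamond Cut")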
The air package has excellent and elegant setup instructions on its GitHub README page, including a secure way to store your OpenAI key. The air::set_key() command triggers a pop-up window for securely storing the key in your system's key ring. You can also set the OpenAI model you want to use with set_model() if you don't want to use the gpt-4 default.
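A minimal setup sketch based on those instructions follows; the model string passed to set_model() and the whatis() usage are my assumptions, so check the README for specifics:

# Install from CRAN and load
install.packages("air")
library(air)

# Store your OpenAI API key securely in the system key ring (opens a pop-up)
air::set_key()

# Optionally switch from the gpt-4 default; this model string is an assumption
set_model("gpt-3.5-turbo")

# Ask a how-to question; the answer prints in your console
howto("rotate x-axis text on a ggplot2 graph")

# Ask what a piece of R code does (assumed usage)
whatis("dplyr::across()")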
Note that this package is for R-related questions only and will not respond to questions about other programming languages. You don’t have to specify that you want code in R in your questions; I did that in my example to make the question comparable to what I asked GitHub Copilot.
The air package was created by Professor Soumya Ray at the College of Technology Management, National Tsing Hua University in Taiwan. It is available on CRAN.
TheOpenAIR package
TheOpenAIR package is an excellent choice for incorporating ChatGPT technology into your own R applications, such as a Shiny app that sends user input to the OpenAI API. You can register your key with the openai_api_key("YOUR-KEY") function.
Its chat() function gives you the option to print results to your console with chat("My request"), save results as text with my_results <- chat("My request", output = "message"), or return a complete API response object with my_results_object <- chat("My request", output = "response object"). The response object is a list that also includes information like the tokens used.
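In code, those three output options look like this:

# Register your key once per session
library(TheOpenAIR)
openai_api_key("YOUR-KEY")

# 1. Print the response in the console
chat("Write an R function that reverses a string")

# 2. Save just the response text
my_results <- chat("Write an R function that reverses a string", output = "message")

# 3. Save the full API response object, which includes token usage
my_results_object <- chat("Write an R function that reverses a string", output = "response object")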
Other useful functions include count_tokens() to count the number of ChatGPT tokens a character string will cost when sent to the API, extract_r_code() to get R code from a ChatGPT response that includes a text explanation along with code, and get_chatlog_id() to get the ID of the current chat (useful if you want to break up a complex application into smaller functions).
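For example, you might check the token cost of a prompt before sending it and then pull just the R code out of the reply. The exact argument handling below is my assumption based on the function descriptions above:

prompt <- "Write ggplot2 code for a bar chart with steel blue bars"

# Estimate how many tokens this prompt will cost to send (assumed usage)
count_tokens(prompt)

# Send the prompt, then strip out just the R code from the mixed text-and-code reply
reply <- chat(prompt, output = "message")
extract_r_code(reply)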
The package has some general coding functions, as well. For example, write_code("filename") generates a prompt asking for your input and the language you want the code written in. The refactor() function is R-specific and does what you'd expect: it asks ChatGPT to refactor your R code.
There are also functions to convert between Python and R or Java and R, although you may end up with a warning message that the conversion from R to Python could result in invalid Python code.
Run help(package = "TheOpenAIR") in your R console to see its many other functions.
TheOpenAIR package was developed by Assistant Professor Ulrich Matter and PhD student Jonathan Chassot at the University of St. Gallen in Switzerland. It is available on CRAN.
RTutor
This app is an elegant and easy way to sample ChatGPT and R. Upload a data set, ask a question, and watch as it generates R code and your results, including graphics. Although it’s named RTutor, the app can also generate Python code.
RTutor is available on the web. It’s currently the only app or package listed that doesn’t require a ChatGPT API key to use, but you’re asked to supply your own for heavy use so as not to bill the creators’ account.
The app’s About page explains that RTutor’s primary goal is “to help people with some R experience to learn R or be more productive … RTutor can be used to quickly speed up the coding process using R. It gives you a draft code to test and refine. Be wary of bugs and errors.”
The code for RTutor is open source and available on GitHub, so you can install your own local version. However, licensing only allows using the app for nonprofit or non-commercial use, or for commercial testing. RTutor is a personal project of Dr. Steven Ge, a professor of bioinformatics at South Dakota State University.
CodeLingo
This multi-language app “translates” code from one programming language to another. Available languages include Java, Python, JavaScript, C, C++, PHP, and more, including R. It is a web application only, available at https://analytica.shinyapps.io/codelingo. You need to input your OpenAI API key to use it (you may want to regenerate the key after testing).
A request to translate code for a ggplot2 R graph into JavaScript generated output using the rather hard-to-learn D3 JavaScript library, as opposed to something a JavaScript newbie would be more likely to want, such as Observable Plot or Vega-Lite.
The request to translate into Python, shown in Figure 3, was more straightforward and used libraries I’d expect. However, ChatGPT didn’t understand that “Set1” is a ColorBrewer color palette and can’t be used directly in Python. As is the case for many ChatGPT uses, translating code between programming languages may give you a useful starting point, but you will need to know how to fix mistakes.
The app was created by Analytica Data Science Solutions.
askgpt
This package, available at https://github.com/JBGruber/askgpt, can be a good starting point for first-time users who want ChatGPT in their console, in part because it gives some instructions upon initial startup. Load the package with library(askgpt) and it responds with:
Hi, this is askgpt ☺.
• To start error logging, run `log_init()` now.
• To see what you can do use `?askgpt()`.
• Or just run `askgpt()` with any question you want!
Use the login() function without first storing a key, and you'll see a message on how to get an API key:
ℹ It looks like you have not provided an API key yet.
1. Go to <https://platform.openai.com/account/api-keys>
2. (Log into your account if you haven't done so yet)
3. On the site, click the button + Create new secret key to create an API key
4. Copy this key into R/RStudio
You'll be asked to save your key in your keyring, and then you're all set for future sessions. If your key is already stored, login() returns no message.
askgpt's default is to store the results of your query as an object, so you can save them to a variable like this one:
barchart_instructions <- askgpt("How do I make a bar chart with custom colors with ggplot2?")
Submit a query and you’ll first see:
GPT is thinking ⠴
This way, you know your request has been sent and an answer should be forthcoming—better than wondering what’s happening after you hit Submit.
Along with the package's general askgpt() function, there are a few coding-specific functions such as annotate_code(), explain_code(), and test_function(). These will involve cutting and pasting responses back into your source code.
For those familiar with the OpenAI API, the package's chat_api() function allows you to set API parameters such as the model you want to use, the maximum tokens you're willing to spend per request, and your desired response temperature (which I'll explain shortly).
The chat_api() function returns a list, with the text portion of the response in YourVariableName$choices[[1]]$message$content. Other useful information is stored in the list as well, such as the number of tokens used.
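A rough sketch of such a call appears below. The argument names are assumptions modeled on the OpenAI API, so check ?chat_api for the exact interface:

library(askgpt)

# Argument names below (prompt, model, max_tokens, temperature) are assumptions
resp <- chat_api(
  prompt = "Write an R function that returns the nth Fibonacci number",
  model = "gpt-3.5-turbo",
  max_tokens = 200,
  temperature = 0
)

# Pull out the response text using the list structure described above
resp$choices[[1]]$message$content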
The askgpt package was created by Johannes Gruber, a post-doc researcher at Vrije Universiteit Amsterdam. It can be installed from CRAN.
gptstudio
According to the package website, gptstudio is a general-purpose helper “for R programmers to easily incorporate use of large language models (LLMs) into their project workflows.” gptstudio and its sibling, gpttools (discussed next), feature RStudio add-ins for working with ChatGPT, although gptstudio also has some command-line functions that will work in any IDE or terminal.
You can access the add-ins within RStudio either from the Addins drop-down menu above the code source pane or by searching for them via the RStudio command palette (Ctrl-Shift-P).
One add-in, ChatGPT, launches a browser-based app for asking your R coding questions. It offers settings options for things like programming style and proficiency, although I had a bit of trouble getting those to work in the latest version on my Mac.
In the screenshot below, I’ve asked how to create a scatterplot in R.
Although designed for R coding help, gptstudio can tap into more ChatGPT capabilities, so you can ask it anything you would ask the original web-based ChatGPT. For instance, this app worked just as well as a ChatGPT tool to write Python code and answer general questions like, “What planet is farthest away from the sun?”
Another of the gptstudio package's add-ins, ChatGPT in Source, lets you write code as usual in your source pane, add a comment requesting the changes you'd like in the code, select the block of code including your comment, and apply the add-in. Then, voilà! Your requested changes are made.
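As a hypothetical illustration of that workflow (my own example, not from the package docs), you might select a block like the following, comment included, and then run the add-in:

# Rewrite this loop using vectorized R code
x <- c(2, 5, 9, 14)
total <- 0
for (i in seq_along(x)) {
  total <- total + x[i]
}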
When I applied the add-in to a block of my own code, I got the correct code back, but it replaced my original code, which might be unsettling if you don't want your original code erased.
gptstudio was written by Michel Nivard and James Wade and is available on CRAN.
gpttools
The aim of the gpttools package “is to extend gptstudio for R package developers to more easily incorporate use of large language models (LLMs) into their project workflows,” according to the package website. The gpttools package isn't on CRAN as of this writing. Instead, you can install it from the JamesHWade/gpttools GitHub repo or from R Universe with the following:
# Enable repository from jameshwade
options(repos = c(
  jameshwade = "https://jameshwade.r-universe.dev",
  CRAN = "https://cloud.r-project.org"
))
# Download and install gpttools in R
install.packages("gpttools")
The package’s add-ins include:
- ChatGPT with Retrieval
- Convert Script to Function
- Add roxygen to Function (documents a function)
- Suggest Unit Test
- Document Data
- Suggest Improvements
To run an add-in, highlight your code, then select the add-in either from the RStudio Addins drop-down menu or by searching for it in the command palette (Tools > Show Command Palette, or Ctrl-Shift-P on Windows and Cmd-Shift-P on a Mac).
When I ran an add-in, I didn’t always see a message telling me that something was happening, so be patient.
Running the Suggest Improvements add-in on this code:
if (exportcsv) {
  filename_root <- strsplit(filename, ".")[[1]][1]
  filename_with_winner <- paste0(filename_root, "_winners.csv")
  rio::export(data, filename_with_winner)
}
returned the following in my console, and I had to look closely to see whether anything had actually changed:
Text to insert: if (exportcsv) { filename_root <- strsplit(filename, ".")[[1]][1]
filename_with_winner <- paste0(filename_root, "_winners.csv") rio::export(data,
filename_with_winner) }
I tried adding a typo to rio::export() and it wasn't fixed, so don't count on this add-in to fix errors in your code.
gptchatteR
gptchatteR is billed as “an experimental and unofficial wrapper for interacting with OpenAI GPT models in R.” One of its advantages is the chatter.plot() function.
Install the package with
remotes::install_github("isinaltinkaya/gptchatteR", build_vignettes = TRUE, dependencies = TRUE)
This ensures that it also installs the required openai package. Then, you can load the package and authenticate with
library(gptchatteR)
chatter.auth("YOUR KEY")
Once that's done, launch a chat session with chatter.create().
The chatter.create() arguments include model for the OpenAI model (the default is text-davinci-003), max_tokens for the maximum number of tokens you want it to use (the default is 100), and a “temperature” set with an argument like this one:
chatter.create(temperature = 0)
According to the OpenAI documentation, the temperature setting can be between 0 and 1 and represents “how often the model outputs a less likely token.”
The higher the temperature, the more random (and usually creative) the output. This, however, is not the same as “truthfulness.” For most factual use cases such as data extraction, and truthful Q&A, the temperature of 0 is best.
The package default is a neutral 0.5. Unless you want to be entertained as opposed to getting usable code, it’s worth setting your temperature to 0.
When I tested the package, it was working but generated this warning:
The `engine_id` argument of `create_completion()` is deprecated as of openai 0.3.0.
ℹ Please use the `model` argument instead.
ℹ The deprecated feature was likely used in the gptchatteR package.
Please report the issue to the authors.
You can create a “casual” chat with chatter.chat("Your input here"). If you think you'll want to follow up after your initial request, use chatter.feed(), which stores your first query for use in a second question, and so on.
After I ran the following code:
library(gptchatteR)
mydf <- data.frame(State = c("CT", "NJ", "NY"), Pop = c(3605944, 9288994, 20201249))
chatter.auth(Sys.getenv("OPENAI_API_KEY"))
chatter.create(temperature = 0)
chatter.feed('I have the following data in R mydf <- data.frame(State = c("CT", "NJ", "NY"), Pop = c(3605944, 9288994, 20201249))')
myplot <- chatter.plot("Make a graph with State on the x axis and Pop on the Y axis")
a graph appeared in my RStudio view pane. The code was stored in myplot$code.
The gptchatteR package was created by Isin Altinkaya, a PhD fellow at the University of Copenhagen.
And one more …
Those are the top eight ChatGPT packages for R. Here's one more, and I will keep adding to this list, so check back in the future.
chatgptimages
The chatgptimages package wasn't designed to help you code. Instead, it uses a familiar R and Shiny interface to access another ChatGPT capability: creating images. There are a number of ethical and intellectual property issues currently tangled up in AI image creation, based on what was used to train the models, which is important to keep in mind if you want to use this package for anything beyond entertainment.
That said, if you'd like to give it a try, note that it doesn't install like a usual package. First, make sure you have the shiny, golem, shinydashboard, openai, config, and testthat packages installed on your system (a one-line install call follows this paragraph). Then, fork and download the entire GitHub repo at https://github.com/analyticsinmotion/chatgpt-images-r-shiny, or download and unzip the repo's .zip file. Open the chatgptimages.Rproj file in RStudio, open the run_dev.R file in the project's dev folder, and run that short file line by line. The app should open in your default browser.
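One way to install the supporting packages listed above in a single call:

# Install the packages the chatgptimages project depends on
install.packages(c("shiny", "golem", "shinydashboard", "openai", "config", "testthat"))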
Follow the instructions on storing a ChatGPT API key, and you can start creating and saving images.
The results look something like what’s shown in Figure 6.
Beyond ChatGPT
If you’d like to test out other large language models that are open source, one non-R-specific tool, Chat with Open Large Language Models, is interesting. It offers access to 20 different models as of this writing and an “arena” where you can test two at once and vote for the best.
Be aware of the terms of use: “non-commercial use only. It only provides limited safety measures and may generate offensive content. It must not be used for any illegal, harmful, violent, racist, or sexual purposes. The service collects user dialogue data for future research.”
As a final note, H2o.ai has a website where you can test models. There are also numerous models available for testing at Hugging Face.