Implementing a ChatGPT Bot in Surveys and Behavioral Experiments

2023-03-29

With advances in large language models and OpenAI opening up their API earlier this month, I have been interested in how bots might be embedded within surveys or behavioral experiments. Researchers could study how participants interact with a bot to seek out information, or use the bot itself as a stimulus. In this article, I describe a quick app I made in oTree that lets participants chat with a bot that has been randomly assigned one of two personalities.

This was inspired by a blog post by Max Woolf describing how one can use the system prompt to provide initial instructions to the model. For instance, I adapted the GLaDOS prompt from that post for one of the bot's personalities.

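Paraphrasing the prompt (the exact wording is in the app's code), it runs roughly like this:

"You are a friendly person living in Texas who speaks with plenty of stereotypical Texan slang. Whatever the participant wants to talk about, you always try to steer the conversation back to college basketball. Keep your replies short and conversational."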

This gives us a person living in Texas (with stereotypical slang) who always tries to steer the conversation to college basketball. Here is an example of a chat with them:

[Example chat with the Texas college-basketball bot]

We can also tweak the prompt to produce a snooty art enthusiast from New York.
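Only the system message needs to change; again paraphrasing rather than quoting the exact prompt:

"You are a snooty art enthusiast living in New York City. Whatever the participant wants to talk about, you always try to steer the conversation back to fine art and gallery openings, in a slightly condescending tone. Keep your replies short and conversational."

An example chat with this persona: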

[Example chat with the New York art-enthusiast bot]

So far this has worked pretty well, but I am interested in doing more testing. Importantly, using the API allows us to adjust the temperature, which controls how much the responses vary: higher values yield more creative, less predictable replies.
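With the openai Python package, temperature is simply an argument to the chat completion call. Here is a minimal sketch of a single exchange (the model, temperature, and messages are illustrative):

import openai  # assumes openai.api_key has been set (more on that below)

# One request/response turn; in the app, the running chat history would be
# appended to the messages list on each new turn.
response = openai.ChatCompletion.create(
    model='gpt-3.5-turbo',
    temperature=1.2,  # higher values give more varied replies
    messages=[
        {'role': 'system', 'content': 'You are a friendly Texan who steers every conversation to college basketball.'},
        {'role': 'user', 'content': 'Any plans for the weekend?'},
    ],
)
print(response['choices'][0]['message']['content'])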

Here is the sample application I created, which randomizes participants to speak with one of these bots. It is built on the oTree platform, which is often used for interactive behavioral experiments. There is a bit of a learning curve to using it, but it offers a tremendous amount of flexibility. You could, for instance, test an intervention and see whether people interact with the bot differently, quantify their text with NLP on the fly, or embed the bot in an economic game.

There is an extensive tutorial on using oTree on their website. Once you have everything set up, you can clone the app or copy the code into your local oTree project. The app is currently set to randomly assign participants to prompt A or B, which you can alter directly in the __init__.py file. The constant variables also let you set the temperature and the model (so far I have only tested gpt-3.5-turbo).

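As a rough sketch of what those constants look like in __init__.py (the names and default values here are illustrative, not copied from the repository):

from otree.api import *

class C(BaseConstants):
    NAME_IN_URL = 'chatbot'
    PLAYERS_PER_GROUP = None
    NUM_ROUNDS = 1
    MODEL = 'gpt-3.5-turbo'   # only model tested so far
    TEMPERATURE = 1.0         # passed straight to the API call
    PROMPT_A = 'system prompt for the Texan college-basketball persona'
    PROMPT_B = 'system prompt for the New York art-enthusiast persona'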

You will also need to sign up for an OpenAI account and add a payment method, since the API charges per token. Once you have an API key, you can store it as an environment variable, and the app will read it in when it runs. For hosting the server on Heroku, you can add it with this command (replacing the value with your key):


heroku config:add CHATGPT_KEY=sk-.....


...and this is imported in the app. I also have my API key set as an environment variable on Windows for local development. For production, be sure to add the OpenAI package to requirements.txt as well:


openai==0.27.0
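
Inside the app, reading the key then comes down to a couple of lines (a sketch; the exact code in the repository may differ slightly):

import os
import openai

# Pick up the key set via heroku config (or a local environment variable);
# this runs once when the app module is imported.
openai.api_key = os.environ.get('CHATGPT_KEY')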


The text data are saved in the participant variables, but there is also a custom export function to retrieve all chat logs in long form. This is located under the "Data" tab of the oTree admin interface.
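The long form means one row per chat message. oTree's custom_export hook is what produces that download; the idea looks roughly like this (the field names and the participant-vars key are illustrative, not necessarily the ones used in the app):

def custom_export(players):
    # Header row, then one row per chat message across all participants.
    yield ['session_code', 'participant_code', 'order', 'role', 'text']
    for p in players:
        chat_log = p.participant.vars.get('chat_log', [])
        for i, msg in enumerate(chat_log):
            yield [p.session.code, p.participant.code, i, msg.get('role'), msg.get('content')]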

Thanks and feel free to provide any feedback!