AI Runpod

Let's take a look at how to use AI models on Runpod.

Deploying to Runpod is a breeze. You can deploy your models on Runpod's serverless platform, create an endpoint, and start making requests to your model in no time. 🚀

Check out the AI Services tutorial to learn how to deploy your custom AI models on Runpod.

Grab your model's endpoint URL from Runpod and modify api/ai/runpod/route.ts in your BuouAI project to make requests to your model.
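If you'd rather not hard-code the endpoint or API key, you can read them from environment variables instead. Here's a minimal sketch: the variable names RUNPOD_ENDPOINT and RUNPOD_API_KEY are just placeholders you can rename, and the actual values come from your Runpod console.

    // .env.local (placeholder variable names, pick your own)
    // RUNPOD_ENDPOINT=<your serverless endpoint URL from the Runpod console>
    // RUNPOD_API_KEY=<your Runpod API key>

    // api/ai/runpod/route.ts (these are read server-side, so no NEXT_PUBLIC_ prefix is needed)
    const RUNPOD_ENDPOINT = process.env.RUNPOD_ENDPOINT ?? "";
    const RUNPOD_API_KEY = process.env.RUNPOD_API_KEY ?? "";

You can then drop these constants into the url and Authorization fields of the request config below.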

 
    // At the top of api/ai/runpod/route.ts
    import axios from "axios";
    import axiosRetry from "axios-retry";
    import { NextResponse } from "next/server";

    // ... inside your route handler

    // Modify the data object to match the request body of your API
    const data = JSON.stringify({
      input: {
        image: base64,
        prompt: requestBody.prompt,
        seed: requestBody.seed,
        style: requestBody.style,
      },
    });

    const config = {
      method: "post",
      maxBodyLength: Infinity,
      url: "#YOUR_RUNPOD_ENDPOINT#", // place your endpoint URL here or use an environment variable
      headers: {
        Authorization: "Bearer #YOUR_RUNPOD_API_KEY#", // place your API key here or use an environment variable
        "Content-Type": "application/json",
      },
      data: data,
    };
 
 
      ....
 
    try {
      // Custom retry logic to handle failed AI requests
      axiosRetry(axios, {
        retryDelay: (retryCount) => {
          return retryCount * 1; // delay between retries, in milliseconds
        },
        retries: 15,
      });

      const result = await axios.request(config);
      if (result.status === 200) {
        return NextResponse.json(result.data);
      } else {
        return NextResponse.json(result.data, { status: result.status });
      }
    } catch (error: any) {
      // error.response is undefined when the request never reached Runpod
      // (e.g. a network error), so fall back to a generic 500 in that case
      const status = error?.response?.status ?? 500;
      const data = error?.response?.data ?? { error: "Request to Runpod failed" };
      return NextResponse.json(data, { status });
    }
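
Once the route is in place, your frontend can call it like any other Next.js API route. Here's a minimal sketch, assuming the request body carries the same fields the route forwards to Runpod (a base64-encoded image plus prompt, seed and style); the base64Image variable and the example values are just placeholders:

    const response = await fetch("/api/ai/runpod", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        image: base64Image, // base64-encoded input image (placeholder variable)
        prompt: "a watercolor landscape",
        seed: 42,
        style: "watercolor",
      }),
    });
    const result = await response.json(); // your model's output, as returned by Runpod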