Fuzzy purple monster in a blue room, AI generated
Fun with AI

Using the Dall-E 2 API with Next.js - Without Third Party Libraries

Karen Turner


Dall-E 2 is the upgraded version of OpenAI's revolutionary AI image generation model. It can create unique and relevant images from textual prompts. In this post, we will demonstrate how to integrate the Dall-E 2 API into a Next.js application without using third-party libraries.

I have been playing around with AI a lot lately and recently implemented Dall-E 2 image generation in a Next.js application I am developing. After reading the official documentation and researching other sources, I found that every example used the openai library. While that works just fine, I try to avoid third-party libraries wherever possible. Skipping them means faster load times, better performance, and a more streamlined development process: I implement only the features I actually need, and I can adopt updated versions of my favorite technologies as soon as I am ready.


Prerequisites

  1. Basic understanding of JavaScript and Next.js
  2. An API key from OpenAI to access the Dall-E 2 API
  3. Node.js and npm installed

Steps to Integrate Dall-E 2 API with Next.js

  1. Create a new Next.js project:
npx create-next-app dall-e-nextjs-demo
cd dall-e-nextjs-demo
  2. Create an .env.local file in your project root directory to store your API key:
NEXT_PUBLIC_DALL_E_API_KEY=your-api-key-here
  3. Create a utils directory and a new file dall-e-api.js inside it:
mkdir utils
touch utils/dall-e-api.js
  4. Add the following code to dall-e-api.js:
const DALL_E_API_URL = 'https://api.openai.com/v1/images/generations';

const generateImage = async (prompt) => {
  try {
    // Send the prompt to the image generations endpoint using the native fetch API
    const response = await fetch(DALL_E_API_URL, {
      method: 'POST',
      headers: {
        'Authorization': `Bearer ${process.env.NEXT_PUBLIC_DALL_E_API_KEY}`,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({
        prompt,
        n: 1,
        size: '512x512',
      }),
    });

    const data = await response.json();
    // The API returns an array of generated images; grab the URL of the first one
    return data.data[0].url;
  } catch (error) {
    console.error('Error generating image:', error);
    return null;
  }
};

export default generateImage;
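For reference, a successful response from the generations endpoint has roughly the shape sketched below (the field values here are made up for illustration), which is why the helper reads data.data[0].url:

```javascript
// Illustrative example of a successful images/generations response body.
// The real url field points at OpenAI-hosted storage; this one is a placeholder.
const exampleResponse = {
  created: 1680000000,
  data: [
    { url: 'https://example.com/generated-image.png' },
  ],
};

// generateImage returns the URL of the first (and, with n: 1, only) image:
const firstUrl = exampleResponse.data[0].url;
console.log(firstUrl); // https://example.com/generated-image.png
```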
  5. Modify the pages/index.js file:
import { useState } from 'react';
import generateImage from '../utils/dall-e-api';

const HomePage = () => {
  const [prompt, setPrompt] = useState('');
  const [imageUrl, setImageUrl] = useState(null);

  const handleSubmit = async (e) => {
    e.preventDefault();
    const url = await generateImage(prompt);
    setImageUrl(url);
  };

  return (
    <div>
      <h1>Dall-E 2 API with Next.js</h1>
      <form onSubmit={handleSubmit}>
        <label htmlFor="prompt">Image Prompt:</label>
        <input
          id="prompt"
          type="text"
          value={prompt}
          onChange={(e) => setPrompt(e.target.value)}
        />
        <button type="submit">Generate Image</button>
      </form>
      {imageUrl && (
        <div>
          <h2>Generated Image:</h2>
          <img src={imageUrl} alt="Generated by Dall-E 2" />
        </div>
      )}
    </div>
  );
};

export default HomePage;
  6. Start the development server:
npm run dev


Now you have successfully integrated the Dall-E 2 API with your Next.js application without using the openai library. Users can enter a prompt, and the application will display an image generated by Dall-E 2. This approach allows for greater flexibility and customization based on your specific needs, while reducing dependencies on third-party libraries. Using the native fetch API provides a lightweight alternative to HTTP clients like Axios, keeping your application lean and performant.
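One caveat worth knowing: Next.js inlines any NEXT_PUBLIC_-prefixed environment variable into the client bundle, so with the setup above the API key is visible to anyone who opens the browser's dev tools. A common way around this is to proxy the request through a Next.js API route so the key never leaves the server. Below is a minimal sketch of that idea; the file name pages/api/generate.js and the server-only env var name DALL_E_API_KEY are my own choices for illustration, not part of the original setup:

```javascript
// pages/api/generate.js (hypothetical) - a server-side proxy for the
// generations endpoint. DALL_E_API_KEY has no NEXT_PUBLIC_ prefix, so
// Next.js keeps it out of the client bundle.
export default async function handler(req, res) {
  if (req.method !== 'POST') {
    res.status(405).json({ error: 'Method not allowed' });
    return;
  }

  // Forward the prompt to OpenAI; the key stays on the server
  const response = await fetch('https://api.openai.com/v1/images/generations', {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${process.env.DALL_E_API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ prompt: req.body.prompt, n: 1, size: '512x512' }),
  });

  const data = await response.json();
  res.status(200).json({ url: data.data[0].url });
}
```

The client component would then POST { prompt } to /api/generate with fetch instead of calling OpenAI directly, and the browser only ever sees the generated image URL.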

Fuzzy purple monster in pink room
Generated by Dall-E 2