Overview

Build a minimal Next.js application using the App Router that accepts a prompt, routes requests through the ModelRunner server proxy, and displays the generated output.
Your ModelRunner key stays on the server: the browser talks only to the proxy route, and the proxy talks to ModelRunner, so the client never sees secrets.

Prerequisites

  • Node.js 18+
  • Next.js 13.4+ (App Router)
  • ModelRunner account and API key
Find your key in your ModelRunner account. Store it as MODELRUNNER_KEY in your environment.

Steps

1. Create a Next.js app

Use the official Next.js starter with the App Router.
npx create-next-app@latest nextjs-modelrunner --ts --eslint --app --tailwind
cd nextjs-modelrunner

2. Install the ModelRunner client and proxy

npm i @modelrunner/client @modelrunner/server-proxy

3. Add the environment variable

Create a .env.local file at the project root.
MODELRUNNER_KEY=your_modelrunner_key
Do not commit .env.local. Next.js ignores it by default via .gitignore.

4. Add the proxy route (App Router)

Create a route under app/api/modelrunner/proxy/route.ts that re-exports the proxy handlers.
// app/api/modelrunner/proxy/route.ts
import { route } from "@modelrunner/server-proxy/nextjs";

// Expose the proxy's GET, POST, and PUT handlers at /api/modelrunner/proxy.
export const { GET, POST, PUT } = route;

5. Configure the client to use the proxy

Set the proxy URL so client calls route through your server. Place this config in a client-side entry point (e.g., app/page.tsx) before making calls.
import { modelrunner } from "@modelrunner/client";

modelrunner.config({
  proxyUrl: "/api/modelrunner/proxy",
});

6. Build the UI: simple prompt form

Replace the home page with a form that calls the client (which uses the proxy) and renders results.
// app/page.tsx
"use client";
import { useState } from "react";
import { modelrunner } from "@modelrunner/client";

modelrunner.config({
  proxyUrl: "/api/modelrunner/proxy",
});

export default function HomePage() {
  const [prompt, setPrompt] = useState("");
  const [loading, setLoading] = useState(false);
  const [error, setError] = useState<string | null>(null);
  const [result, setResult] = useState<any>(null);

  async function onSubmit(e: React.FormEvent) {
    e.preventDefault();
    setLoading(true);
    setError(null);
    setResult(null);
    try {
      // The request goes to /api/modelrunner/proxy (configured above), which
      // forwards it to ModelRunner with the server-side key.
      const response = await modelrunner.run("bytedance/sdxl-lightning-4step", {
        input: {
          prompt,
          negative_prompt: "worst quality, low quality, deformed, extra fingers",
          width: 1280,
          height: 1024,
          num_outputs: 1,
          scheduler: "K_EULER",
          num_inference_steps: 4,
          guidance_scale: 0,
          seed: 103,
          disable_safety_checker: false,
        },
      });

      setResult(response.data);
    } catch (err) {
      setError(err instanceof Error ? err.message : "Unknown error");
    } finally {
      setLoading(false);
    }
  }

  return (
    <main className="mx-auto max-w-2xl p-6 space-y-6">
      <h1 className="text-2xl font-semibold">ModelRunner Image Generation</h1>
      <form onSubmit={onSubmit} className="space-y-4">
        <label className="block">
          <span className="text-sm font-medium">Prompt</span>
          <input
            className="mt-1 w-full rounded-md border p-2"
            placeholder="two friends cooking together"
            value={prompt}
            onChange={(e) => setPrompt(e.target.value)}
          />
        </label>
        <button
          type="submit"
          className="rounded-md bg-blue-500 px-4 py-2 text-white disabled:opacity-50"
          disabled={loading || !prompt}
        >
          {loading ? "Generating..." : "Generate"}
        </button>
      </form>

      {error && (
        <div className="rounded-md border border-red-300 bg-red-50 p-3 text-red-700">
          {error}
        </div>
      )}

      {result && (
        <section className="space-y-2">
          <h2 className="text-lg font-medium">Result</h2>
          <div className="text-sm text-gray-700">
            <div>
              <span className="font-semibold">Status:</span> {result?.status}
            </div>
            {typeof result?.inferenceTime === "number" && (
              <div>
                <span className="font-semibold">Inference time:</span> {result.inferenceTime}s
              </div>
            )}
          </div>

          {Array.isArray(result?.output) ? (
            <div className="grid grid-cols-1 gap-3 sm:grid-cols-2">
              {result.output.map((item: any, index: number) => {
                const url = typeof item === "string" ? item : item?.url || item?.image_url;
                return url ? (
                  // eslint-disable-next-line @next/next/no-img-element
                  <img key={index} src={url} alt={`Generated ${index + 1}`} className="rounded-lg" />
                ) : (
                  <pre key={index} className="overflow-auto rounded-md bg-gray-100 p-3 text-sm">
                    {JSON.stringify(item, null, 2)}
                  </pre>
                );
              })}
            </div>
          ) : result?.output ? (
            <pre className="overflow-auto rounded-md bg-gray-100 p-3 text-sm">
              {JSON.stringify(result.output, null, 2)}
            </pre>
          ) : (
            <pre className="overflow-auto rounded-md bg-gray-100 p-3 text-sm">
              {JSON.stringify(result, null, 2)}
            </pre>
          )}
        </section>
      )}
    </main>
  );
}
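
For reference, the fields read from response.data above suggest a result shape roughly like the one below. This is only a sketch inferred from the code in this guide, not the client's official typings; check the types exported by @modelrunner/client before relying on it.
// Assumed shape of response.data, inferred from the fields used in app/page.tsx.
// Field names and optionality may differ in the real @modelrunner/client types.
interface ModelRunnerResult {
  status?: string;        // e.g. "completed"
  inferenceTime?: number; // reported in seconds when available
  output?: unknown;       // often string[] of image URLs, or objects with url / image_url
}
With a type like this in place, useState<any>(null) in the page above could become useState<ModelRunnerResult | null>(null).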

7. Run the app

npm run dev
Visit http://localhost:3000. Enter a prompt and submit. You should see the API response or an image.

Verification and troubleshooting

  • 401 Unauthorized: Confirm MODELRUNNER_KEY is set in .env.local and restart the dev server after changing it.
  • CORS issues: The external call happens server-to-server inside the proxy. Make sure the browser calls your own /api/modelrunner/proxy route, not the ModelRunner API directly.
  • Unexpected response shape: Log result and adjust the rendering; some models return arrays of URLs, others nested objects (see the helper sketch below).
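
For the last point, a small helper can normalize whatever output shape comes back into a flat list of image URLs. This is a plain utility sketch, not part of the ModelRunner API; it only mirrors the shapes the page above already handles (string URLs, or objects with url / image_url).
// lib/extract-image-urls.ts (hypothetical helper; mirrors the rendering logic in app/page.tsx)
// Collects any image URLs found in a model's output; returns [] when none are found.
export function extractImageUrls(output: unknown): string[] {
  const items = Array.isArray(output) ? output : [output];
  return items
    .map((item) => {
      if (typeof item === "string") return item;
      if (item && typeof item === "object") {
        const candidate = item as { url?: string; image_url?: string };
        return candidate.url ?? candidate.image_url ?? null;
      }
      return null;
    })
    .filter((url): url is string => typeof url === "string" && url.length > 0);
}
The rendering branch in app/page.tsx could then map over extractImageUrls(result.output) and fall back to the raw JSON dump only when the array comes back empty.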

For form submissions you can also call the client from a Server Action instead of from the browser; in that case, keep modelrunner.config and any secret usage server-only.
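
A minimal sketch of that approach is below. It assumes @modelrunner/client can run server-side and pick up MODELRUNNER_KEY from the environment (or through whatever server-side credentials option the client exposes); check the JavaScript client guide for the exact mechanism.
// app/actions.ts (sketch; the auth mechanism is an assumption, see note above)
"use server";

import { modelrunner } from "@modelrunner/client";

export async function generateImage(formData: FormData) {
  const prompt = String(formData.get("prompt") ?? "");
  if (!prompt) return { error: "Prompt is required" };

  // This runs on the server, so no proxyUrl is needed and the key never
  // reaches the browser.
  const response = await modelrunner.run("bytedance/sdxl-lightning-4step", {
    input: { prompt, num_inference_steps: 4, guidance_scale: 0 },
  });

  return { data: response.data };
}
You would then pass generateImage to the form's action prop instead of calling the client in onSubmit, and keep the proxy route for any calls you still make from the browser.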

Reference

See the JavaScript client guide for more usage patterns.