In this tutorial, you will learn how to use the credentials from an App you’ve created in Tyk AI Studio to build a simple command-line interface (CLI) application with Node.js. This application will take a text prompt from you, send it to an LLM via the Tyk AI Gateway, and print the response.

Prerequisites

Before you begin, you will need:
  • An approved App in Tyk AI Studio: You should have already created an App and had it approved by a Studio Administrator. If you haven’t, please follow the guide on App Management for Consumers.
  • Your App’s Secret (API key): You will need this to authenticate your application.
  • Node.js and npm installed: This tutorial uses Node.js. You can download it from nodejs.org.
  • An LLM configured in AI Studio: Your App must be configured to use at least one LLM. This tutorial uses an OpenAI model.
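To confirm Node.js and npm are available before you start, you can check their versions from a terminal (the ai SDK used below generally requires a recent Node.js release, 18 or later):

```shell
# Verify that Node.js and npm are installed and on your PATH.
node --version   # expect v18.x or later
npm --version
```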

Instructions

Step 1: Set up your Node.js project

First, create a new folder for your project and initialize a new Node.js project within it. The npm pkg set type=module command marks the project as an ES module, which is required for the import statements used in Step 3.
mkdir tyk-ai-cli-app
cd tyk-ai-cli-app
npm init -y
npm pkg set type=module
npm install ai @ai-sdk/openai

Step 2: Create the CLI application file

Create a new file named index.js in your project folder. This is where you will write the code for your CLI application.
touch index.js

Step 3: Write the application code

Open index.js in your code editor and add the following code. Be sure to replace the placeholder values with your App’s API key and your AI Gateway endpoint.
import readline from "node:readline";
import { generateText } from "ai";
import { createOpenAI } from "@ai-sdk/openai";

// ---- CONFIG ----
const apiKey = "YOUR_API_KEY";                    // your App's API key (Secret) from AI Studio
const baseURL = "YOUR_TYK_AI_STUDIO_GATEWAY_URL"; // base URL of your Tyk AI Gateway

// Create OpenAI-compatible client
const openai = createOpenAI({
  apiKey,
  baseURL
});

// CLI interface
const rl = readline.createInterface({
  input: process.stdin,
  output: process.stdout,
  prompt: "You> "
});

console.log("LLM CLI started. Type a prompt and press enter.\n");
rl.prompt();

rl.on("line", async (line) => {
  const prompt = line.trim();

  if (!prompt) {
    rl.prompt();
    return;
  }

  try {
    const { text } = await generateText({
      model: openai("gpt-4o-mini"),
      prompt
    });

    console.log("\nLLM>", text, "\n");
  } catch (err) {
    console.error("Error:", err.message);
  }

  rl.prompt();
});
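Hardcoding credentials is fine for a quick test, but a safer pattern is to read them from environment variables so they never end up in source control. A minimal sketch (the variable names TYK_APP_API_KEY and TYK_GATEWAY_URL are our own choice for this example, not a Tyk convention):

```javascript
// loadConfig reads the App credentials from environment variables and
// throws early if either one is missing, so the CLI fails fast with a
// clear message instead of a confusing 401 from the gateway.
function loadConfig(env = process.env) {
  const apiKey = env.TYK_APP_API_KEY;
  const baseURL = env.TYK_GATEWAY_URL;
  if (!apiKey || !baseURL) {
    throw new Error("Set TYK_APP_API_KEY and TYK_GATEWAY_URL before running");
  }
  return { apiKey, baseURL };
}
```

With this in place, replace the two hardcoded constants at the top of index.js with `const { apiKey, baseURL } = loadConfig();` and export the variables in your shell before running the app.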

Step 4: Run your CLI App

Now you can run your application from your terminal.
node index.js
Example:
LLM CLI started. Type a prompt and press enter.

You> explain kubernetes in simple terms

LLM> Kubernetes is a system that helps run and manage containers...

Congratulations! You have successfully used your Tyk AI Studio App credentials to build a working AI-powered CLI application.
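For longer answers, you may prefer to stream tokens as they arrive instead of waiting for the full response. The ai package provides streamText for this. The sketch below is a self-contained variant of the Step 3 code (same client and model; credentials are read from the environment variables suggested above, which are our own naming choice):

```javascript
import { streamText } from "ai";
import { createOpenAI } from "@ai-sdk/openai";

// Same OpenAI-compatible client as in Step 3, configured via env vars.
const openai = createOpenAI({
  apiKey: process.env.TYK_APP_API_KEY,
  baseURL: process.env.TYK_GATEWAY_URL
});

// streamText returns immediately; result.textStream is an async iterable
// that yields chunks of the response as the model produces them.
const result = streamText({
  model: openai("gpt-4o-mini"),
  prompt: "Explain Kubernetes in one sentence."
});

for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
process.stdout.write("\n");
```

To use this in the CLI, swap the generateText call inside the rl.on("line", ...) handler for the streamText loop above.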

Frequently Asked Questions

Why am I getting a 401 Unauthorized error?

A 401 error means your request is not properly authenticated. Check the following:
  • Make sure your App has been approved by an administrator and its credentials are active.
  • Double-check that you have correctly copied your API key (Secret) into the index.js file.
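To isolate whether the problem is your credentials or your code, you can call the gateway directly with curl. This sketch assumes the gateway exposes an OpenAI-compatible chat completions path under your base URL; adjust the path and model name to match your deployment:

```shell
# Replace the placeholders with your gateway URL and API key.
# Prints only the HTTP status code of the response.
curl -s -o /dev/null -w "%{http_code}\n" \
  "YOUR_TYK_AI_STUDIO_GATEWAY_URL/chat/completions" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-4o-mini", "messages": [{"role": "user", "content": "ping"}]}'
# 200 means the credentials work; 401 means they were rejected.
```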
Can I use a different LLM?

Yes, you can use any LLM that you have configured in your App.