
getExecutionLogs

@tpmjs/tools-hllm

Get execution logs for topology runs.

Official · agent · v0.1.2 · MIT
⚠️ This tool is currently broken

The last test execution failed with a runtime error: Authentication failed: Invalid API key. Ensure HLLM_API_KEY is correct.

Last checked: 1/17/2026, 1:02:17 AM
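The error above indicates the tool authenticates via an HLLM_API_KEY environment variable. A minimal setup sketch (the variable name comes from the error message; the value below is a placeholder, not a real key):

```shell
# Export the API key before running your script.
# Replace the placeholder with your actual HLLM key.
export HLLM_API_KEY="your-hllm-api-key"
```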


Installation & Usage

Install this tool and use it with the AI SDK

1. Install the package

npm install @tpmjs/tools-hllm
pnpm add @tpmjs/tools-hllm
yarn add @tpmjs/tools-hllm
bun add @tpmjs/tools-hllm
deno add npm:@tpmjs/tools-hllm

2. Import the tool

import { getExecutionLogs } from '@tpmjs/tools-hllm';

3. Use with AI SDK

import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';
import { getExecutionLogs } from '@tpmjs/tools-hllm';

const result = await generateText({
  model: openai('gpt-4o'),
  tools: { getExecutionLogs },
  prompt: 'Your prompt here...',
});

console.log(result.text);

Parameters

Available configuration options (auto-extracted from the tool's schema)
sessionId
Optional
Type: string

Filter logs by session ID.

limit
Optional
Type: number

Maximum number of logs to return.

offset
Optional
Type: number

Number of logs to skip.
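All three parameters are optional, and `limit` and `offset` together give standard paged retrieval. A self-contained sketch of those semantics over an in-memory array (the log record shape and filtering logic are illustrative assumptions, not the tool's actual implementation):

```typescript
// Illustrative log record shape (assumed, not the tool's real schema).
interface ExecutionLog {
  sessionId: string;
  message: string;
}

// Mimics the tool's sessionId/limit/offset semantics:
// filter by session first, then apply offset and limit.
function queryLogs(
  logs: ExecutionLog[],
  { sessionId, limit = 50, offset = 0 }: { sessionId?: string; limit?: number; offset?: number } = {},
): ExecutionLog[] {
  const filtered = sessionId ? logs.filter((l) => l.sessionId === sessionId) : logs;
  return filtered.slice(offset, offset + limit);
}

const logs: ExecutionLog[] = [
  { sessionId: 'a', message: 'run started' },
  { sessionId: 'a', message: 'node executed' },
  { sessionId: 'b', message: 'run started' },
  { sessionId: 'a', message: 'run finished' },
];

// Second entry of session "a": skip 1, take 1.
console.log(queryLogs(logs, { sessionId: 'a', limit: 1, offset: 1 }));
// → [ { sessionId: 'a', message: 'node executed' } ]
```

Note that `offset` applies after the `sessionId` filter in this sketch; whether the real tool filters before or after paging is not documented here.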

Schema extracted: 1/17/2026, 12:59:42 AM


Statistics

Downloads/month: 0 · Quality Score: 0% · Bundle Size: n/a
NPM Keywords

tpmjs, hllm, ai, llm, topology, agent

Maintainers

thomasdavis (thomasalwyndavis@gmail.com)

Frameworks

vercel-ai