
logisticRegressionTool

@tpmjs/tools-logistic-regression

Perform binary logistic regression using gradient descent. Fits a model to predict binary outcomes (0 or 1) from feature variables. Returns coefficients, predictions, and accuracy metrics.

Official
statistics
v0.2.0
MIT


Installation & Usage

Install this tool and use it with the AI SDK

1. Install the package

npm install @tpmjs/tools-logistic-regression
pnpm add @tpmjs/tools-logistic-regression
yarn add @tpmjs/tools-logistic-regression
bun add @tpmjs/tools-logistic-regression
deno add npm:@tpmjs/tools-logistic-regression

2. Import the tool

import { logisticRegressionTool } from '@tpmjs/tools-logistic-regression';

3. Use with AI SDK

import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';
import { logisticRegressionTool } from '@tpmjs/tools-logistic-regression';

const result = await generateText({
  model: openai('gpt-4o'),
  tools: { logisticRegressionTool },
  prompt: 'Your prompt here...',
});

console.log(result.text);
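
The model decides when to call the tool based on the prompt. A prompt that supplies feature rows and binary labels, as in the sketch below (the wording is illustrative, not part of the package), will typically trigger a call:

const analysis = await generateText({
  model: openai('gpt-4o'),
  tools: { logisticRegressionTool },
  prompt:
    'Fit a logistic regression to x = [[1, 2], [2, 3], [3, 4], [4, 5]] with labels y = [0, 0, 1, 1] and report the accuracy.',
});

// In the AI SDK, the tool invocations and their results are also exposed on the
// response object, e.g. analysis.toolCalls and analysis.toolResults.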

Parameters

Available configuration options

Auto-extracted from the tool's input schema.

  • x (required, array): Feature matrix where each row is a sample and each column is a feature
  • y (required, array): Binary target labels (must be 0 or 1)
  • iterations (optional, number): Number of gradient descent iterations (default: 1000)
  • learningRate (optional, number): Learning rate for gradient descent (default: 0.1)
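
Both optional parameters can be passed alongside the required arrays. A minimal sketch using the direct execute call documented in the README below (the data values and settings are illustrative):

import { logisticRegressionTool } from '@tpmjs/tools-logistic-regression';

// Fit with explicit optimizer settings; omitting these falls back to the defaults above.
const fit = await logisticRegressionTool.execute({
  x: [
    [0.5, 1.2],
    [1.1, 0.3],
    [2.4, 2.0],
    [3.3, 2.8],
  ],
  y: [0, 0, 1, 1],
  iterations: 2000,
  learningRate: 0.05,
});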


README

Logistic Regression Tool

Simple binary logistic regression implementation using gradient descent optimization.

Installation

npm install @tpmjs/tools-logistic-regression

Usage

import { logisticRegressionTool } from '@tpmjs/tools-logistic-regression';

// Example: Predict binary outcome from features
const result = await logisticRegressionTool.execute({
  x: [
    [1.0, 2.0],
    [2.0, 3.0],
    [3.0, 4.0],
    [4.0, 5.0],
  ],
  y: [0, 0, 1, 1],
  iterations: 1000,
});

console.log(result);
// {
//   coefficients: [0.5, 0.3, -0.2], // [intercept, feature1, feature2]
//   predictions: [0, 0, 1, 1],
//   accuracy: 1.0,
//   iterations: 1000,
//   convergence: {
//     finalLoss: 0.123,
//     converged: true
//   }
// }
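
The returned coefficients can be applied to a new sample by hand. A minimal sketch assuming the [intercept, feature1, feature2] ordering shown in the example output above:

// Probability that a new sample belongs to class 1, using the fitted weights.
function predictProbability(coefficients: number[], features: number[]): number {
  const [intercept, ...weights] = coefficients;
  const z = intercept + weights.reduce((sum, w, i) => sum + w * features[i], 0);
  return 1 / (1 + Math.exp(-z)); // sigmoid
}

const p = predictProbability(result.coefficients, [2.5, 3.5]);
console.log(p > 0.5 ? 1 : 0); // thresholded binary prediction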

API

Input

  • x (required): Feature matrix number[][] where each row is a sample
  • y (required): Binary labels number[] (must be 0 or 1)
  • iterations (optional): Number of gradient descent iterations (default: 1000)
  • learningRate (optional): Learning rate (default: 0.1)

Output

  • coefficients: Model weights including intercept
  • predictions: Binary predictions for each sample
  • accuracy: Classification accuracy (0 to 1)
  • iterations: Number of iterations performed
  • convergence: Loss and convergence status
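
Expressed as types, the shapes above look roughly like the sketch below (the interface names are illustrative; the package does not necessarily export them):

interface LogisticRegressionInput {
  x: number[][];         // feature matrix, one row per sample
  y: number[];           // binary labels (0 or 1), one per row of x
  iterations?: number;   // gradient descent iterations (default: 1000)
  learningRate?: number; // gradient descent step size (default: 0.1)
}

interface LogisticRegressionOutput {
  coefficients: number[]; // [intercept, ...featureWeights]
  predictions: number[];  // 0 or 1 per sample
  accuracy: number;       // fraction of correct predictions, 0 to 1
  iterations: number;     // iterations actually performed
  convergence: {
    finalLoss: number;    // final binary cross-entropy loss
    converged: boolean;
  };
}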

Algorithm

Uses gradient descent to minimize binary cross-entropy loss:

  1. Initialize coefficients to zero
  2. For each iteration:
    • Calculate predictions using sigmoid function
    • Compute gradient of loss function
    • Update coefficients: θ = θ - α∇L, where α is the learning rate and ∇L is the gradient of the loss
  3. Return fitted model

The sigmoid function σ(z) = 1 / (1 + e^(-z)) maps linear combinations of the features to probabilities in (0, 1).
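
A condensed sketch of that loop in TypeScript (an illustration of the algorithm described above, not the package's actual source):

// Sigmoid maps a linear combination z to a probability in (0, 1).
const sigmoid = (z: number): number => 1 / (1 + Math.exp(-z));

// Fit binary logistic regression by gradient descent on cross-entropy loss.
function fitLogisticRegression(
  x: number[][],
  y: number[],
  iterations = 1000,
  learningRate = 0.1
): number[] {
  const nFeatures = x[0].length;
  // Step 1: initialize coefficients to zero: [intercept, w1, ..., wk]
  const theta: number[] = new Array(nFeatures + 1).fill(0);

  // Step 2: repeat for the requested number of iterations
  for (let iter = 0; iter < iterations; iter++) {
    const gradient: number[] = new Array(nFeatures + 1).fill(0);

    for (let i = 0; i < x.length; i++) {
      // Prediction for sample i via the sigmoid of the linear combination
      const z = theta[0] + x[i].reduce((sum, xij, j) => sum + theta[j + 1] * xij, 0);
      const error = sigmoid(z) - y[i]; // derivative of cross-entropy w.r.t. z

      gradient[0] += error; // intercept term
      for (let j = 0; j < nFeatures; j++) {
        gradient[j + 1] += error * x[i][j];
      }
    }

    // θ = θ - α∇L, with the gradient averaged over the samples
    for (let j = 0; j < theta.length; j++) {
      theta[j] -= (learningRate * gradient[j]) / x.length;
    }
  }

  // Step 3: return the fitted coefficients
  return theta;
}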

License

MIT

Statistics

  • Downloads/month: 0
  • Quality Score: 0%
  • Bundle Size: (not reported)
NPM Keywords

tpmjs
statistics
machine-learning
logistic-regression
classification

Maintainers

thomasdavis (thomasalwyndavis@gmail.com)

Frameworks

vercel-ai