
Hopfield


Hopfield is a Typescript-first large language model framework with static type inference, testability, and composability. Easily validate LLM responses and inputs with strong types. Flexible abstractions with best practices baked in.

Add it to your project, along with any peer dependencies:

npm i hopfield

ready, set, hop

See how easy it is to add composable, type-safe LLM features with Hopfield:

```ts
import hop from "hopfield";
import { chat } from "./openai";

const incomingUserMessage = "How do I reset my password?";

const messages: hop.inferMessageInput<typeof chat>[] = [
  {
    content: incomingUserMessage,
    role: "user",
  },
];

const parsed = await chat.get({
  messages,
});

if (parsed.choices[0].__type === "function_call") {
  // inferred: __type: "function_call" | "stop" | "length" | "content_filter"
  const category = parsed.choices[0].message.function_call.arguments.category;
  // inferred: category: "ACCOUNT_ISSUES" | "BILLING_AND_PAYMENTS" | "TECHNICAL_SUPPORT" |
  //   "FEATURE_REQUESTS" | "BUG_REPORTS" | "PRODUCT_INQUIRIES" | "PASSWORD_RESET" |
  //   "SECURITY_ISSUES" | "SERVICE_OUTAGES" | ... 22 more ... | "OTHERS"
  await handleMessageWithCategory(category, incomingUserMessage);
}
```
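The `handleMessageWithCategory` call above is left to the application. A minimal sketch of what such a handler could look like (the routing table and queue names here are hypothetical, not part of Hopfield):

```ts
// Hypothetical handler: routes a triaged message to a support queue.
// The queue names and routing rules below are illustrative only.
const CATEGORY_QUEUES: Record<string, string> = {
  PASSWORD_RESET: "self-serve",
  BILLING_AND_PAYMENTS: "billing",
  SECURITY_ISSUES: "security-escalation",
};

async function handleMessageWithCategory(
  category: string,
  message: string,
): Promise<string> {
  // Fall back to a general queue for categories without a dedicated owner.
  const queue = CATEGORY_QUEUES[category] ?? "general";
  console.log(`Routing "${message}" to queue: ${queue}`);
  return queue;
}
```

Because `category` is inferred as a string-literal union, a typo in a queue key would surface at compile time if you type the record keys with that union instead of `string`.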
```ts
import hop from "hopfield";
import openai from "hopfield/openai";
import OpenAI from "openai";
import z from "zod";

const hopfield = hop.client(openai).provider(new OpenAI());

const categoryDescription = hopfield
  .template()
  .enum("The category of the message.");
// inferred: "The category of the message. This must always be a possible value
//   from the `enum` array."

const classifyMessage = hopfield.function({
  name: "classifyMessage",
  description: "Triage an incoming support message.",
  parameters: z.object({
    summary: z.string().describe("The summary of the message."),
    category: SupportCategoryEnum.describe(categoryDescription),
  }),
});

export const chat = hopfield.chat().functions([classifyMessage]);
```

TL;DR

Hopfield might be a good fit for your project if:

  • 🏗️ You build with Typescript/Javascript, and have your database schemas in these languages (e.g. Prisma and/or Next.js).
  • 🪨 You don't need a heavyweight LLM orchestration framework that ships with a ton of dependencies you'll never use.
  • 🤙 You're using OpenAI function calling and/or custom tools, and want Typescript-native features for them (e.g. validations w/ Zod).
  • 💬 You're building complex LLM interactions which use memory & RAG, evaluation, and orchestration (coming soon™).
  • 📝 You want best-practice, extensible templates, which use string literal types under the hood for transparency.

Oh, and liking Typescript is a nice-to-have.

Guiding principles

  • 🌀 We are Typescript-first, and only support TS (or JS) - with services like Replicate or OpenAI, why do you need Python?
  • 🤏 We provide a simple, ejectable interface with common LLM use-cases. This is aligned 1-1 with LLM provider abstractions, like OpenAI's.
  • 🪢 We explicitly don't provide a ton of custom tools (please don't ask for too many 😅) outside of the building blocks and simple examples provided. Other frameworks provide these, but when you use them, you soon realize the tool you want is very use-case specific.
  • 🧪 We (will) provide evaluation frameworks which let you simulate user scenarios and backend interactions with the LLM, including multi-turn conversations and function calling.
  • 🐶 We support Node.js, Vercel Edge Functions, Cloudflare Workers, and more (oh and even web, if you like giving away API keys).

Community

If you have questions or need help, reach out to the community in the Hopfield GitHub Discussions.

Chase Adams

Creator

Learn more

Read the Getting Started guide to learn how to use Hopfield.

Inspiration

Shoutout to these projects which inspired us:

If you like Hopfield, go star them on GitHub too.

Released under the MIT License.