ModelFusion
Build AI applications, chatbots, and agents with JavaScript and TypeScript.
> [!NOTE]
> ModelFusion is in its initial development phase. Until version 1.0 there may be breaking changes, because I am still exploring the API design. Feedback and suggestions are welcome.
Introduction
Quick Install
```sh
npm install modelfusion
```

To define schemas (e.g. for `generateJson` and tools), also install Zod and the JSON Schema exporter:

```sh
npm install zod zod-to-json-schema
```
Usage Examples
generateText
```ts
const text = await generateText(
  new OpenAITextGenerationModel({ model: "text-davinci-003" }),
  "Write a short story about a robot learning to love:\n\n"
);
```
streamText
```ts
const textStream = await streamText(
  new OpenAIChatModel({ model: "gpt-3.5-turbo", maxTokens: 1000 }),
  [
    OpenAIChatMessage.system("You are a story writer."),
    OpenAIChatMessage.user("Write a story about a robot learning to love"),
  ]
);

for await (const textFragment of textStream) {
  process.stdout.write(textFragment);
}
```
Prompt Mapping
```ts
const text = await generateText(
  new LlamaCppTextGenerationModel({
    contextWindowSize: 4096, // Llama 2 context window size
    nPredict: 1000,
  }).mapPrompt(InstructionToLlama2PromptMapping()),
  {
    system: "You are a story writer.",
    instruction: "Write a short story about a robot learning to love.",
  }
);
```
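For reference, a mapping such as `InstructionToLlama2PromptMapping` turns the structured prompt into the Llama 2 chat text format. A hypothetical sketch of that format (an illustration of the target format, not ModelFusion's actual implementation):

```typescript
// Hypothetical sketch of the Llama 2 instruction prompt format.
// The special markers ([INST], <<SYS>>) come from the Llama 2 chat format.
function toLlama2Prompt({
  system,
  instruction,
}: {
  system?: string;
  instruction: string;
}): string {
  return system != null
    ? `<s>[INST] <<SYS>>\n${system}\n<</SYS>>\n\n${instruction} [/INST]`
    : `<s>[INST] ${instruction} [/INST]`;
}
```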
```ts
const textStream = await streamText(
  new OpenAIChatModel({
    model: "gpt-3.5-turbo",
  }).mapPrompt(ChatToOpenAIChatPromptMapping()),
  [
    { system: "You are a celebrated poet." },
    { user: "Write a short story about a robot learning to love." },
    { ai: "Once upon a time, there was a robot who learned to love." },
    { user: "That's a great start!" },
  ]
);
```
Metadata and original responses
```ts
// access the full response and the metadata:
// the response type is specific to the model that is being used
const { response, metadata } = await generateText(
  new OpenAITextGenerationModel({
    model: "text-davinci-003",
    maxTokens: 1000,
    n: 2, // generate 2 completions
  }),
  "Write a short story about a robot learning to love:\n\n",
  { fullResponse: true }
);

for (const choice of response.choices) {
  console.log(choice.text);
}

console.log(`Duration: ${metadata.durationInMs}ms`);
```
generateJson

```ts
const value = await generateJson(
  new OpenAIChatModel({
    model: "gpt-3.5-turbo",
    temperature: 0,
    maxTokens: 50,
  }),
  {
    name: "sentiment" as const,
    description: "Write the sentiment analysis",
    schema: z.object({
      sentiment: z
        .enum(["positive", "neutral", "negative"])
        .describe("Sentiment."),
    }),
  },
  OpenAIChatFunctionPrompt.forSchemaCurried([
    OpenAIChatMessage.system(
      "You are a sentiment evaluator. " +
        "Analyze the sentiment of the following product review:"
    ),
    OpenAIChatMessage.user(
      "After I opened the package, I was met by a very unpleasant smell " +
        "that did not disappear even after washing. Never again!"
    ),
  ])
);
```
generateJsonOrText

```ts
const { schema, value, text } = await generateJsonOrText(
  new OpenAIChatModel({ model: "gpt-3.5-turbo", maxTokens: 1000 }),
  [
    {
      name: "getCurrentWeather" as const, // mark 'as const' for type inference
      description: "Get the current weather in a given location",
      schema: z.object({
        location: z
          .string()
          .describe("The city and state, e.g. San Francisco, CA"),
        unit: z.enum(["celsius", "fahrenheit"]).optional(),
      }),
    },
    {
      name: "getContactInformation" as const,
      description: "Get the contact information for a given person",
      schema: z.object({
        name: z.string().describe("The name of the person"),
      }),
    },
  ],
  OpenAIChatFunctionPrompt.forSchemasCurried([OpenAIChatMessage.user(query)])
);
```
Create Tool
```ts
const calculator = new Tool({
  name: "calculator" as const, // mark 'as const' for type inference
  description: "Execute a calculation",

  inputSchema: z.object({
    a: z.number().describe("The first number."),
    b: z.number().describe("The second number."),
    operator: z.enum(["+", "-", "*", "/"]).describe("The operator."),
  }),

  execute: async ({ a, b, operator }) => {
    switch (operator) {
      case "+":
        return a + b;
      case "-":
        return a - b;
      case "*":
        return a * b;
      case "/":
        return a / b;
      default:
        throw new Error(`Unknown operator: ${operator}`);
    }
  },
});
```
useTool
```ts
const { tool, parameters, result } = await useTool(
  new OpenAIChatModel({ model: "gpt-3.5-turbo" }),
  calculator,
  OpenAIChatFunctionPrompt.forToolCurried([
    OpenAIChatMessage.user("What's fourteen times twelve?"),
  ])
);
```
useToolOrGenerateText
```ts
const { tool, parameters, result, text } = await useToolOrGenerateText(
  new OpenAIChatModel({ model: "gpt-3.5-turbo" }),
  [calculator /* ... */],
  OpenAIChatFunctionPrompt.forToolsCurried([
    OpenAIChatMessage.user("What's fourteen times twelve?"),
  ])
);
```
transcribe

```ts
const transcription = await transcribe(
  new OpenAITranscriptionModel({ model: "whisper-1" }),
  {
    type: "mp3",
    data: await fs.promises.readFile("data/test.mp3"),
  }
);
```
generateImage

```ts
const image = await generateImage(
  new OpenAIImageGenerationModel({ size: "512x512" }),
  "the wicked witch of the west in the style of early 19th century painting"
);
```
embedTexts

```ts
const embeddings = await embedTexts(
  new OpenAITextEmbeddingModel({ model: "text-embedding-ada-002" }),
  [
    "At first, Nox didn't know what to do with the pup.",
    "He keenly observed and absorbed everything around him, from the birds in the sky to the trees in the forest.",
  ]
);
```
Tokenization

```ts
const tokenizer = new TikTokenTokenizer({ model: "gpt-4" });

const text = "At first, Nox didn't know what to do with the pup.";

const tokenCount = await countTokens(tokenizer, text);
const tokens = await tokenizer.tokenize(text);
const tokensAndTokenTexts = await tokenizer.tokenizeWithTexts(text);
const reconstructedText = await tokenizer.detokenize(tokens);
```
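To illustrate the tokenizer interface shape, here is a hypothetical character-level tokenizer with the same `tokenize`/`detokenize` contract (real tokenizers such as TikToken use subword units, not characters):

```typescript
// Hypothetical character-level tokenizer illustrating the interface shape.
const toyTokenizer = {
  // Map each character to its Unicode code point:
  tokenize: async (text: string): Promise<number[]> =>
    Array.from(text, (char) => char.codePointAt(0)!),

  // Reverse the mapping to reconstruct the original text:
  detokenize: async (tokens: number[]): Promise<string> =>
    tokens.map((token) => String.fromCodePoint(token)).join(""),
};

// detokenize(tokenize(text)) round-trips the input:
const tokens = await toyTokenizer.tokenize("Nox");
const reconstructed = await toyTokenizer.detokenize(tokens);
```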
upsertTextChunks, retrieveTextChunks

```ts
const texts = [
  "A rainbow is an optical phenomenon that can occur under certain meteorological conditions.",
  "It is caused by refraction, internal reflection and dispersion of light in water droplets resulting in a continuous spectrum of light appearing in the sky.",
  // ...
];

const vectorIndex = new MemoryVectorIndex<TextChunk>();
const embeddingModel = new OpenAITextEmbeddingModel({
  model: "text-embedding-ada-002",
});

// update an index - usually done as part of an ingestion process:
await upsertTextChunks({
  vectorIndex,
  embeddingModel,
  chunks: texts.map((text) => ({ text })),
});

// retrieve text chunks from the vector index - usually done at query time:
const { chunks } = await retrieveTextChunks(
  new SimilarTextChunksFromVectorIndexRetriever({
    vectorIndex,
    embeddingModel,
    maxResults: 3,
    similarityThreshold: 0.8,
  }),
  "rainbow and water droplets"
);
```
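Conceptually, similarity-based retrieval over an in-memory index embeds the query, scores it against the stored vectors, and keeps the best matches above the threshold. A hypothetical sketch of that idea using cosine similarity (not ModelFusion's implementation):

```typescript
// Hypothetical in-memory index entry: a text chunk plus its embedding vector.
type Entry = { text: string; vector: number[] };

// Cosine similarity between two equal-length vectors:
function cosineSimilarity(a: number[], b: number[]): number {
  const dot = a.reduce((sum, v, i) => sum + v * b[i], 0);
  const norm = (v: number[]) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  return dot / (norm(a) * norm(b));
}

// Score every entry, filter by threshold, sort by similarity, keep the top results:
function retrieve(
  index: Entry[],
  queryVector: number[],
  options: { maxResults: number; similarityThreshold: number }
): Entry[] {
  return index
    .map((entry) => ({
      entry,
      similarity: cosineSimilarity(queryVector, entry.vector),
    }))
    .filter(({ similarity }) => similarity >= options.similarityThreshold)
    .sort((a, b) => b.similarity - a.similarity)
    .slice(0, options.maxResults)
    .map(({ entry }) => entry);
}
```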
Features
Integrations
Vector Indices
Observability
Prompt Formats
| Prompt Format | Instruction | Chat |
| ------------- | ----------- | ---- |
| OpenAI        | ✅          | ✅   |
| Llama         | ✅          | ✅   |
| Alpaca        | ✅          | ❌   |
| Vicuna        | ❌          | ✅   |
| Generic       | ✅          | ✅   |
Documentation
More Examples
- _terminal app_, _chat_, _llama.cpp_
- _Next.js app_, _OpenAI GPT-3.5-turbo_, _streaming_, _abort handling_
- _terminal app_, _PDF parsing_, _in-memory vector indices_, _retrieval-augmented generation_, _hypothetical document embedding_
- _Next.js app_, _Stability AI image generation_
- _Next.js app_, _OpenAI Whisper_
- _terminal app_, _agent_, _BabyAGI_
- _terminal app_, _agent_, _tools_, _GPT-4_
- _terminal app_, _PDF parsing_, _recursive information extraction_, _in-memory vector index_, _style example retrieval_, _OpenAI GPT-4_, _cost calculation_