TypeScript for OpenAI
Getting started with the OpenAI API using TypeScript, with yarn for package management and nvm for managing Node versions.
Set up the environment
nvm install lts/*
npm install -g strip-json-comments-cli
npm install -g yarn
Set up the project directory. I use ~/src/<project>
mkdir <applicationname>
cd <applicationname>
node --version > .nvmrc
yarn init -y
Example package.json for the chatone application.
{
  "name": "chatone",
  "version": "0.1.0",
  "main": "index.js",
  "author": "Mark C Allen <mark@markcallen.com>",
  "license": "MIT"
}
Add TypeScript and the tooling. Note that dotenv is needed at runtime, so it goes in as a regular dependency rather than a dev dependency.
yarn add -D typescript @types/node tsx rimraf
yarn add dotenv
Create the tsconfig.json file
npx tsc --init --rootDir src --outDir dist \
--esModuleInterop --resolveJsonModule --lib es2021 --target es2021 \
--module nodenext --allowJs true --noImplicitAny true
cat tsconfig.json | strip-json-comments --no-whitespace | jq -r . > tsconfig.pretty.json && mv tsconfig.pretty.json tsconfig.json
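The pipeline above leans on the strip-json-comments CLI because tsc --init emits a tsconfig full of comments, which plain jq cannot parse. The core idea is simple enough to sketch in TypeScript (a naive stripper for // and /* */ comments outside strings; this is just the idea, the real tool covers more edge cases):

```typescript
// Naive JSONC comment stripper: removes // and /* */ comments that appear
// outside string literals, so the result can be fed to JSON.parse.
export function stripJsonComments(src: string): string {
  let out = '';
  let inString = false;
  for (let i = 0; i < src.length; i++) {
    const ch = src[i];
    if (inString) {
      out += ch;
      if (ch === '\\') out += src[++i]; // keep escaped char verbatim
      else if (ch === '"') inString = false;
    } else if (ch === '"') {
      inString = true;
      out += ch;
    } else if (ch === '/' && src[i + 1] === '/') {
      while (i < src.length && src[i] !== '\n') i++; // skip to end of line
      out += '\n';
    } else if (ch === '/' && src[i + 1] === '*') {
      i += 2;
      while (i < src.length && !(src[i] === '*' && src[i + 1] === '/')) i++;
      i++; // skip the closing '/'
    } else {
      out += ch;
    }
  }
  return out;
}
```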
tsconfig.json
{
  "compilerOptions": {
    "target": "es2021",
    "lib": [
      "es2021"
    ],
    "module": "nodenext",
    "rootDir": "src",
    "resolveJsonModule": true,
    "allowJs": true,
    "outDir": "dist",
    "esModuleInterop": true,
    "forceConsistentCasingInFileNames": true,
    "strict": true,
    "noImplicitAny": true,
    "skipLibCheck": true
  }
}
Get an OpenAI API key and add it to your .env file.
# OpenAI API key for LangChain GPT-4 agent
OPENAI_API_KEY=sk-proj...
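A missing or mistyped key only surfaces as an authentication error at request time, so it is worth failing fast at startup. A minimal sketch (the checkApiKey name and the "sk-" prefix heuristic are mine, not part of the OpenAI SDK):

```typescript
// Loose sanity check: OpenAI keys currently start with "sk-", but treat that
// as a heuristic, not a guaranteed format. dotenv loads .env into process.env.
export function checkApiKey(key: string | undefined): string {
  if (!key || !key.startsWith('sk-')) {
    throw new Error('OPENAI_API_KEY is missing or malformed; check your .env file');
  }
  return key;
}
```

Call checkApiKey(process.env.OPENAI_API_KEY) once at startup, after import 'dotenv/config'.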
Add openai and the LangChain packages. The chat bot imports from @langchain/openai, so it needs to be installed alongside @langchain/core.
yarn add openai @langchain/core @langchain/openai
Create a simple chat bot in the src/ directory.
mkdir src
import 'dotenv/config';
import { ChatOpenAI } from '@langchain/openai';
import readline from 'readline/promises';
import { stdin as input, stdout as output } from 'node:process';

const rl = readline.createInterface({ input, output });

// The API key comes from .env, loaded above by dotenv.
const model = new ChatOpenAI({
  temperature: 0.7,
  openAIApiKey: process.env.OPENAI_API_KEY,
});

async function ask(question: string) {
  const res = await model.invoke(question);
  return res.content;
}

async function main() {
  console.log('🚀 LLM chat started. Type your questions below. Ctrl+C to exit.\n');
  // Read a question, send it to the model, print the answer, repeat.
  while (true) {
    const userInput = await rl.question('❓ You: ');
    const response = await ask(userInput);
    console.log(`🧠 LLM: ${response}\n`);
  }
}

main();
src/index.ts
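One limitation of the loop above: each model.invoke(question) call is independent, so the bot forgets earlier turns. LangChain's invoke also accepts a list of [role, content] tuples, so a small rolling history makes the chat conversational. A sketch (the ChatHistory class and the maxTurns limit are my additions, not part of LangChain):

```typescript
// Rolling chat history: keeps at most `maxTurns` question/answer pairs so the
// prompt stays bounded. The 'human'/'ai' roles mirror the [role, content]
// message tuples LangChain accepts.
type Turn = [role: 'human' | 'ai', content: string];

export class ChatHistory {
  private turns: Turn[] = [];

  constructor(private maxTurns = 10) {}

  add(question: string, answer: string): void {
    this.turns.push(['human', question], ['ai', answer]);
    // Drop the oldest pair once we exceed the limit.
    while (this.turns.length > this.maxTurns * 2) {
      this.turns.splice(0, 2);
    }
  }

  messages(): Turn[] {
    return [...this.turns];
  }
}
```

In main(), you would then call something like model.invoke([...history.messages(), ['human', userInput]]) and record each exchange with history.add(userInput, answer).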
Add scripts to the package.json
npm pkg set "scripts.build"="rimraf ./dist && tsc"
Now build and run
yarn build
node dist/index.js
🚀 LLM chat started. Type your questions below. Ctrl+C to exit.
❓ You: who are you?
🧠 LLM: I am a language model AI created to assist with various tasks and provide information. How can I help you today?
❓ You:
To run without having to explicitly compile the TypeScript
npm pkg set "scripts.dev"="tsx src/index.ts"
Now run in dev mode
yarn dev
yarn run v1.22.22
$ tsx src/index.ts
🚀 LLM chat started. Type your questions below. Ctrl+C to exit.
❓ You:
Add everything to git.
git init
cat << EOF > .gitignore
.env
yarn-error.log
dist/
node_modules/
EOF
git add .
git commit -m "First checkin" -a
There we go: a working TypeScript chatbot that uses OpenAI.
The next step is to add a Dockerfile for TypeScript.