Project Readme

This repository contains code that integrates with the OpenAI API using the LangChain library. It implements a simple Streamlit demo application for natural language processing: the user enters a question and the app returns a model-generated response.

Getting Started

To run this code, you will need Python installed on your machine. You will also need to install the required dependencies listed in the requirements.txt file.

pip install -r requirements.txt
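
The exact dependencies are defined in requirements.txt in the repository. Based on the modules described below, a minimal set would likely look something like the following (an illustrative assumption, not the actual file contents):

streamlit
langchain-community
langchain-core
langsmith
python-dotenv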

Additionally, you need to set up your environment variables. Make sure to create a .env file in the root directory of the project and add the following variables:

LANGCHAIN_ENDPOINT=https://api.smith.langchain.com
LANGCHAIN_TRACING_V2=true
LANGCHAIN_PROJECT=test
LANGCHAIN_API_KEY=<your_api_key>

Replace <your_api_key> with your LangSmith API key; LANGCHAIN_API_KEY authenticates against the LangSmith endpoint configured above.
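
These variables must be visible to the Python process at startup. If main.py loads them with python-dotenv (an assumption about this project; the call itself is the standard dotenv API), the relevant lines would look roughly like:

from dotenv import load_dotenv

load_dotenv()  # reads the LANGCHAIN_* variables from the .env file in the project root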

Usage

After setting up the environment variables, you can run the code using Streamlit:

streamlit run main.py

This will start a local server, and you can access the demo application in your web browser (by default at http://localhost:8501).
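
To run the app on a different port, you can pass Streamlit's standard --server.port option, for example:

streamlit run main.py --server.port 8502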

Code Overview

The code consists of the following components:

  • main.py: This is the main Python script that contains the Streamlit application code. It prompts the user to enter a question and then invokes the language model through LangChain to get a response (see the sketch after this list).
  • langchain_community: This module provides the Ollama class, which serves as the language-model backend for the chain.
  • langchain_core: This module contains LangChain's core building blocks, such as prompt templates and output parsers.
  • langsmith: This module provides tracing and evaluation utilities for LangChain applications; it is configured through the LANGCHAIN_* variables above.
  • .env: This file contains the environment variables required for the application to run, including the LangSmith API key.
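
As a minimal sketch of how main.py could wire these pieces together, assuming a locally running Ollama server and a standard LangChain expression-language chain (the model name, prompt text, and page labels below are illustrative assumptions, not the repository's actual values):

import streamlit as st
from dotenv import load_dotenv
from langchain_community.llms import Ollama
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate

load_dotenv()  # make the LANGCHAIN_* tracing variables available to LangSmith

# Build a simple prompt -> model -> output-parser chain
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    ("user", "Question: {question}"),
])
llm = Ollama(model="llama2")  # hypothetical model name
chain = prompt | llm | StrOutputParser()

st.title("LangChain Demo")
question = st.text_input("Enter your question:")
if question:
    st.write(chain.invoke({"question": question}))

With the tracing variables from .env set, each call to chain.invoke should also be recorded as a run in the LangSmith project named test.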

Contributing

Contributions are welcome! If you find any issues or have suggestions for improvements, please feel free to open an issue or submit a pull request.