Educhain refactored #40

Open · wants to merge 9 commits into `main`
12 changes: 7 additions & 5 deletions cookbook/Convert_any_webpage_to_quiz.ipynb
@@ -17,7 +17,7 @@
{
"cell_type": "markdown",
"source": [
-"[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1i-K3mdWfyApCuUx7t_uSRTczZ-B4qpdI?usp=sharing)"
+"[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/185Py9FXMLspaEqwmUAx0I1p-IeCa67zH?usp=sharing)"
],
"metadata": {
"id": "ScXuwxJl9uTo"
@@ -43,7 +43,7 @@
},
{
"cell_type": "code",
-"execution_count": 1,
+"execution_count": null,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
@@ -89,17 +89,19 @@
"metadata": {
"id": "e5R2skFs6wCq"
},
-"execution_count": 2,
+"execution_count": null,
"outputs": []
},
{
"cell_type": "code",
"source": [
-"from educhain import qna_engine\n",
+"from educhain import Educhain\n",
"from langchain_openai import ChatOpenAI\n",
"\n",
🛠️ Refactor suggestion

Consider making the model name configurable.

The code specifies a particular model ("gpt-4o-mini") for the ChatOpenAI instance. This might not be flexible for all users, and it's unclear if this is a publicly available model.

Consider the following improvements:

  1. Make the model name configurable, perhaps through an environment variable or a configuration file.
  2. Add a comment explaining the choice of this specific model and any requirements for using it.

Here's a suggested implementation:

```python
import os

# Use an environment variable with a default value
model_name = os.getenv('OPENAI_MODEL_NAME', 'gpt-4o-mini')

# Add a comment explaining the model choice
# The gpt-4o-mini model is used for its balance of performance and efficiency.
# Ensure you have access to this model in your OpenAI account.
gpt_4o_mini = ChatOpenAI(model=model_name)
```
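A quick self-contained check of the env-var fallback suggested above (`OPENAI_MODEL_NAME` is the reviewer's hypothetical variable name; no OpenAI call is made here):

```python
import os

# When the variable is unset, getenv falls back to the default model name;
# when it is set, the environment value wins.
model_name = os.getenv("OPENAI_MODEL_NAME", "gpt-4o-mini")
print(model_name)
```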

"gpt_4o_mini = ChatOpenAI(model = \"gpt-4o-mini\")\n",
"\n",
+"client = Educhain()\n",
+"\n",
Comment on lines +98 to +104
⚠️ Potential issue

Update qna_engine usage to client.

The import statement has been correctly updated to use Educhain instead of qna_engine, and a new Educhain client has been instantiated. However, the subsequent code still uses qna_engine, which will cause an error.

Please update the qna_engine usage to use the new client instance. Here's the suggested change:

```diff
 client = Educhain()

-url_mcqs = qna_engine.generate_mcqs_from_data(
+url_mcqs = client.generate_mcqs_from_data(
     source="https://en.wikipedia.org/wiki/Butterfly_effect",
     source_type="url",
     num=5,
     llm = gpt_4o_mini
 )
```

Committable suggestion was skipped due to low confidence.

"url_mcqs = qna_engine.generate_mcqs_from_data(\n",
" source=\"https://en.wikipedia.org/wiki/Butterfly_effect\",\n",
" source_type=\"url\",\n",
@@ -116,7 +118,7 @@
"id": "-GIvZQuO6zKH",
"outputId": "230790b6-50b9-4eb3-cb16-289024245d6a"
},
-"execution_count": 4,
+"execution_count": null,
"outputs": [
{
"output_type": "stream",
23 changes: 13 additions & 10 deletions cookbook/educhain_claude3_5_sonnet.ipynb
@@ -3,7 +3,7 @@
{
"cell_type": "markdown",
"source": [
-"[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1pVcI0MPqlww9O5NXwOWxMNFwkWsEhyON?usp=sharing)"
+"[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1-t8Lvh10g3Qj_6bALuWIySqEHX9wWyjw?usp=sharing)"
],
"metadata": {
"id": "dNGLGan2h4I2"
@@ -22,7 +22,7 @@
},
{
"cell_type": "code",
-"execution_count": 1,
+"execution_count": null,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
@@ -60,7 +60,7 @@
},
{
"cell_type": "code",
-"execution_count": 2,
+"execution_count": null,
"metadata": {
"id": "EpFqH5_WXQq_"
},
@@ -74,7 +74,7 @@
},
{
"cell_type": "code",
-"execution_count": 4,
+"execution_count": null,
"metadata": {
"id": "2AESsAOd-xfF"
},
@@ -88,11 +88,14 @@
{
"cell_type": "code",
"source": [
-"from educhain import qna_engine\n",
+"from educhain import Educhain, LLMConfig\n",
 "\n",
+"sonnet_config = LLMConfig(custom_model=sonnet)\n",
+"\n",
-"questions = qna_engine.generate_mcq(topic = \"Functions - Calculus\",\n",
-" num = 5,\n",
-" llm = sonnet\n",
+"client = Educhain(sonnet_config)\n",
+"\n",
+"questions = client.qna_engine.generate_questions(topic = \"Functions - Calculus\",\n",
+" num = 5\n",
" )\n",
"\n",
"questions"
@@ -104,7 +107,7 @@
"id": "YVq1bTXviJLI",
"outputId": "d51bf077-2c92-44e4-83bf-298a0cf72ef3"
},
-"execution_count": 6,
+"execution_count": null,
"outputs": [
{
"output_type": "execute_result",
@@ -130,7 +133,7 @@
"id": "eRO5DcWlJaok",
"outputId": "2874fe5b-f870-488f-f678-e73a521b870b"
},
-"execution_count": 7,
+"execution_count": null,
"outputs": [
{
"output_type": "stream",
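The refactor's pattern in the notebook above (wrap a custom model in `LLMConfig`, pass it to the `Educhain` constructor) can be sketched with self-contained stand-ins. `MiniLLMConfig` and `MiniClient` below are illustrative stubs, not educhain's actual classes:

```python
from dataclasses import dataclass
from typing import Any, Optional

@dataclass
class MiniLLMConfig:
    # Stand-in for educhain's LLMConfig: carries an optional custom model
    custom_model: Optional[Any] = None

class MiniClient:
    # Stand-in for Educhain: resolves its model once, at construction time
    def __init__(self, config: Optional[MiniLLMConfig] = None):
        if config is not None and config.custom_model is not None:
            self.model = config.custom_model
        else:
            self.model = "default-model"

# A custom model passed via the config takes precedence over the default
client = MiniClient(MiniLLMConfig(custom_model="claude-3-5-sonnet"))
print(client.model)  # → claude-3-5-sonnet
```

The design choice here is that model selection happens once in the constructor, so every engine hanging off the client (`qna_engine`, `content_engine`) shares one model instead of each call taking an `llm=` argument.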
5 changes: 3 additions & 2 deletions docs/features/mcq_from_data.md
@@ -5,9 +5,10 @@ Generate engaging MCQs from various data sources using AI! 🧠✨
## 🚀 Basic Usage

```python
-from educhain import generate_mcqs_from_data
+from educhain import Educhain

-questions = generate_mcqs_from_data(
+client = Educhain()
+questions = client.qna_engine.generate_questions_from_data(
Comment on lines +8 to +11
⚠️ Potential issue

Update code snippet and align documentation with new API

The code snippet has been updated to use the new Educhain client, which is a good improvement. However, there are a few inconsistencies and areas for improvement:

  1. The method name generate_questions_from_data doesn't match the one mentioned in the AI summary (generate_mcqs_from_data). Please verify if this is the correct method name.

  2. The parameters learning_objective and difficulty_level are included in the method call. These should be explained in the "Customization Options" table.

  3. The rest of the documentation needs to be updated to reflect the new API usage. This includes:

    • Updating the "Customization Options" table to include the new parameters and remove any obsolete ones.
    • Revising the "Output Format" section if the return type has changed.
    • Checking if the "Pro Tips" and "Supported Data Sources" sections need any modifications.

To address these points, please:

  1. Confirm the correct method name and update it consistently throughout the document.
  2. Update the "Customization Options" table to include all current parameters:
| Option | Description | Example Values |
|--------|-------------|----------------|
| `source` | Data source for question generation | PDF file path, URL, or text content |
| `source_type` | Type of the data source | "pdf", "url", "text" |
| `num` | Number of questions to generate | 5, 10, 20 |
| `learning_objective` | Goal of the questions | "Understand AI basics", "Apply ML concepts" |
| `difficulty_level` | Difficulty of the questions | "Beginner", "Intermediate", "Advanced" |
| `llm` | Custom language model (optional) | ChatOpenAI(model="gpt-4") |
| `prompt_template` | Custom prompt template (optional) | "Generate questions about {topic}..." |
  3. Review and update the "Output Format" section if the return type has changed with the new API.

  4. Scan through the rest of the document to ensure all information is up-to-date with the new Educhain client usage.

source="https://en.wikipedia.org/wiki/Artificial_intelligence",
source_type="url",
num=5,
5 changes: 3 additions & 2 deletions docs/features/mcq_generation.md
@@ -8,9 +8,10 @@ Unleash the power of AI to create engaging MCQs! 🧠✨
## 🚀 Basic Usage

```python
-from educhain import qna_engine
+from educhain import Educhain

-questions = qna_engine.generate_mcq(
+client = Educhain()
+questions = client.qna_engine.generate_questions(
topic="Python Programming",
level="Intermediate",
num=10
40 changes: 19 additions & 21 deletions docs/getting-started/quick-start.md
@@ -9,10 +9,14 @@ Get up and running with Educhain in minutes! 🚀

Here's a simple example to generate multiple-choice questions:


```python
-from educhain import qna_engine
+from educhain import Educhain
+
+client = Educhain()

-questions = qna_engine.generate_mcq(
+questions = client.qna_engine.generate_questions(
topic="Python Programming",
level="Beginner",
num=5
@@ -30,7 +34,9 @@ for i, q in enumerate(questions, 1):
Customize your questions with additional parameters:

```python
-questions = qna_engine.generate_mcq(
+client = Educhain()
+
+questions = client.qna_engine.generate_questions(
topic="Machine Learning",
level="Intermediate",
num=3,
@@ -44,9 +50,9 @@ questions = qna_engine.generate_mcq(
Create comprehensive lesson plans with ease:

```python
-from educhain import content_engine
-
-lesson_plan = content_engine.generate_lesson_plan(
+from educhain import Educhain
+client = Educhain()
+lesson_plan = client.content_engine.generate_lesson_plan(
topic="World War II",
grade_level="High School",
duration="60 minutes"
@@ -92,24 +98,16 @@ export EDUCHAIN_API_KEY="your-api-key-here"
Choose your preferred language model:

```python
-from educhain import qna_engine
+from langchain_google_genai import ChatGoogleGenerativeAI
+from educhain import Educhain, LLMConfig

-qna_engine.set_model("gpt-4") # Default is "gpt-3.5-turbo"
-```
-
-## 🎨 Customizing Prompt Templates
-
-Define your own prompt templates:
-
-```python
-from educhain import qna_engine
+gemini_flash = ChatGoogleGenerativeAI(
+    model="gemini-1.5-flash-exp-0827",
+    google_api_key="GOOGLE_API_KEY")

-custom_template = """
-Generate {num} multiple-choice questions about {topic} at {level} level.
-Each question should have 4 options and one correct answer.
-"""
+flash_config = LLMConfig(custom_model=gemini_flash)

-qna_engine.set_prompt_template(custom_template)
+client = Educhain(flash_config)
```


15 changes: 4 additions & 11 deletions docs/index.md
@@ -31,9 +31,11 @@ Educhain consistently outperforms traditional methods in content generation speed
## 🚀 Get Started in Minutes

```python
-from educhain import qna_engine
+from educhain import Educhain

-questions = qna_engine.generate_mcq(
+client = Educhain()
+
+questions = client.qna_engine.generate_questions(
topic="Indian History",
level="Beginner",
num=5
@@ -47,15 +49,6 @@

Educators worldwide are using Educhain to transform their teaching. Check out our [success stories](resources/case-studies.md) to see how Educhain is making a difference in classrooms around the globe.

-## 🗺️ Roadmap
-
-We're constantly improving Educhain. Here's what's on the horizon:
-
-- [ ] Finetuned Model for question generation
-- [ ] Integration with popular Learning Management Systems
-- [ ] Mobile app for on-the-go content generation
-
-[📅 View our full roadmap](https://github.com/educhain/educhain/projects/1)

## 🤝 Contributing
