Llmify #549

Merged
merged 13 commits on May 23, 2023
11 changes: 11 additions & 0 deletions CHANGELOG.md
@@ -7,6 +7,17 @@ As of v0.2-alpha, this project is attempting to adhere to [Semantic Versioning](
While alpha, however, any version may include breaking changes that may not be specifically noted as such,
and breaking changes will not necessarily result in changes to the main version number.

## [v1.6.15-alpha](https://github.com/Lexpedite/blawx/releases/tag/v1.6.15-alpha) 2023-05-23

### Added
* If you provide an API access key for an OpenAI account, Blawx will use ChatGPT (GPT-3.5) to summarize its explanations and display those summaries in the scenario editor.

### Changed
* A disclaimer has been added to the GCWeb-styled version of the scenario editor.

### TODO
* Update the documentation for the scenario editor.

## [v1.6.14-alpha](https://github.com/Lexpedite/blawx/releases/tag/v1.6.14-alpha) 2023-05-12

### Added
13 changes: 13 additions & 0 deletions INSTALL.md
@@ -21,6 +21,9 @@ cd blawx
./update.sh
```

Note that the `./update.sh` script runs the Blawx server in the terminal, for development purposes, so that you can see debug information.
If you want to run the Docker container in the background instead, add the `-d` flag to the `docker run` command in that script, as sketched below.
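For example, the modified line in the script might look roughly like this (a sketch only; the actual `docker run` invocation in `update.sh` may use additional flags, and the image name and port mapping below are assumed to match the ones used elsewhere in this file):

```
docker run -d -p 8000:8000 blawx
```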

The `./update.sh` script will take several minutes to run the first time.

The Blawx server will now be available at [http://127.0.0.1:8000](http://127.0.0.1:8000),
@@ -36,6 +39,16 @@ of running the `./update.sh` script.
A demo account with username "demo" and password "blawx2022" is also created,
and should be deleted in the admin interface if you want to restrict access to your server.

## Configure ChatGPT Integration

If you wish to run Blawx with ChatGPT integration, which allows AI-generated summaries of explanations to be displayed
to the user in the scenario editor, do not use the `./update.sh` command; instead, enter these two commands:

```
docker build -t blawx .
docker run -it -p 8000:8000 -e OPENAI_API_KEY="your_key_goes_here" blawx
```
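
If you also want the ChatGPT-enabled container to run in the background, the `-d` flag described above applies here as well; for example (again, only a sketch):

```
docker run -d -p 8000:8000 -e OPENAI_API_KEY="your_key_goes_here" blawx
```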

## Updating Blawx

Blawx is under active development. Currently, updates are being sent to GitHub only, there is no published
7 changes: 7 additions & 0 deletions blawx/fixtures/docs/components/scenario_editor.yaml
@@ -110,6 +110,13 @@
attributes, all those contingent answers will be included, also. You can see which answers are contingent
by looking to see whether there are parts of the explanations that indicate the reason was assumed.

### ChatGPT-Generated Summaries

If you followed the instructions for providing Blawx with your OpenAI API Key, the Scenario Editor will attempt
to obtain an AI-generated summary of the details inside each explanation for an answer, display that summary
at the top of the explanation, and provide the standard tree-structured explanation in a collapsible area beneath
the summary. The summary is prefaced with a warning that it was generated by a generative AI.

## View

The View tab of the scenario editor gives you the ability to customize the Facts tab by hiding various elements
8 changes: 8 additions & 0 deletions blawx/fixtures/docs/features/answers.yaml
@@ -108,6 +108,14 @@
It is possible that age is a factor that can exclude you, but cannot include you. So you
would not really be getting the answer to your question unless you ran both queries.

## ChatGPT Summaries of Explanations

If you provide Blawx with an OpenAI API Key when running the server (see `INSTALL.md` for details),
your tree-structured explanations in the scenario editor will be prefaced with an AI-generated
plain-language summary. The summary is prefaced with a warning that it should not be relied upon for
understanding how the reasoner reached the conclusion, and the actual tree-structured explanation on
which it is based is still made available.




3 changes: 2 additions & 1 deletion blawx/requirements.txt
@@ -5,4 +5,5 @@ pyyaml
cobalt
clean-law >=0.0.4
django-guardian
django-preferences
django-preferences
openai
2 changes: 1 addition & 1 deletion blawx/settings.py
@@ -13,7 +13,7 @@
from pathlib import Path

# For adding a version identifier
BLAWX_VERSION = "v1.6.14-alpha"
BLAWX_VERSION = "v1.6.15-alpha"


# Build paths inside the project like this: BASE_DIR / 'subdir'.
27 changes: 27 additions & 0 deletions blawx/simplifier.py
@@ -0,0 +1,27 @@
from django.http import Http404, HttpResponseNotFound, HttpResponseForbidden

from rest_framework.decorators import api_view, permission_classes, authentication_classes
from rest_framework.response import Response
# from rest_framework.permissions import AllowAny
from rest_framework.authentication import SessionAuthentication, BasicAuthentication
from rest_framework.permissions import IsAuthenticated, DjangoObjectPermissions, IsAuthenticatedOrReadOnly, AllowAny

import openai
import os

prompt_preamble = """
What follows is an automatically generated explanation. Restate it in plain language without restating mathematical calculations and
without further justifying conclusions for which there is only an absence of evidence in support.


"""

@api_view(['POST'])
@authentication_classes([SessionAuthentication])
@permission_classes([IsAuthenticated])
def simplify(request):
    # The openai package reads the OPENAI_API_KEY environment variable automatically.
    if "OPENAI_API_KEY" in os.environ:
        completion = openai.ChatCompletion.create(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": prompt_preamble + request.data['explanation']}],
        )
        return Response(completion.choices[0].message.content)
    else:
        # No key configured: return an empty string so the client falls back to the tree view.
        return Response("")
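
For illustration, here is a minimal sketch of exercising this view with the Django test client. It is not part of the pull request: the URL name `simplify`, the absence of `OPENAI_API_KEY` in the test environment, and the throwaway test user are all assumptions.

```
# Hypothetical test sketch; not part of this change. Assumes the view's URL is
# registered under the name "simplify" and that OPENAI_API_KEY is unset, so the
# view returns an empty string without calling the OpenAI API.
from django.contrib.auth.models import User
from django.test import TestCase
from django.urls import reverse


class SimplifyViewSketch(TestCase):
    def setUp(self):
        # Any authenticated user will do; the view requires a session login.
        self.user = User.objects.create_user("demo", password="blawx2022")
        self.client.force_login(self.user)

    def test_simplify_without_key_returns_empty_string(self):
        response = self.client.post(
            reverse("simplify"),
            {"explanation": "the person is eligible because the person is over 18"},
            content_type="application/json",
        )
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.json(), "")
```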
68 changes: 67 additions & 1 deletion blawx/templates/blawx/scenario_editor.html
@@ -1510,14 +1510,25 @@ <h6>Response</h6>
var model_count = j + 1;
var model_heading_name = "answer_" + count + "_model_" + model_count + "_heading";
var model_collapse_name = "answer_" + count + "_model_" + model_count + "_collapse";


output_content += '<div class="accordion-item"><h2 class="accordion-header" id="' + model_heading_name + '">';
output_content += '<button class="accordion-button collapsed" type="button" data-bs-toggle="collapse" data-bs-target="#' + model_collapse_name + '" aria-expanded="false" aria-controls="' + model_collapse_name + '">';
output_content += 'Explanation #' + model_count;
output_content += '</button></h2>';
output_content += '<div id="' + model_collapse_name + '" class="accordion-collapse collapse" aria-labelledby="' + model_heading_name + '" style="">';
//for (var attribute in models[j]['Residuals']) {
output_content += '<div id="' + model_collapse_name + '_simplified">';
output_content += '</div>';
//for (var attribute in models[j]['Residuals']) {
// attributes_output.push(describe_constraint(models[j]['Residuals'][attribute]));
//}

output_content += '<div id="' + model_collapse_name + '_detail_header" class="accordion-item"><h3 class="accordion-header">';
output_content += '<button class="accordion-button collapsed" type="button" data-bs-toggle="collapse" data-bs-target="#' + model_collapse_name + '_detail_content" aria-expanded="false" aria-controls="' + model_collapse_name + '_detail_content">';
output_content += 'Details';
output_content += '</button></h3>';
output_content += '<div id="' + model_collapse_name + '_detail_content" class="accordion-collapse collapse" aria-labelledby="' + model_collapse_name + '_detail_header">';

var constraints_output = describe_constraints_new(models[j].Residuals, models[j].Terms);


@@ -1561,13 +1572,55 @@ <h6>Response</h6>
//output_content += getNodesFromModel(models[j].Raw);
// output_content += convertModelToParagraphs(models[j].Raw);
output_content += '</div></div>';
output_content += '</div></div>';

}
output_content += '</div></div></div>';
}
output_content += '</div>';
answer_element.innerHTML = output_content;

// Now we summarize them
for (var a = 0; a < parsed_test_response['Answers'].length; a++) {
for (var e = 0; e < parsed_test_response['Answers'][a]['Models'].length; e++) {
// Use let so each response handler below captures the element and names for its own
// answer and explanation, rather than the values from the last loop iteration.
let target_explanation = document.getElementById("answer_" + (a+1) + "_model_" + (e+1) + "_collapse");
let target_summary = document.getElementById("answer_" + (a+1) + "_model_" + (e+1) + "_collapse_simplified");
let detail_header_name = "answer_" + (a+1) + "_model_" + (e+1) + "_collapse_detail_header";
let detail_content_name = "answer_" + (a+1) + "_model_" + (e+1) + "_collapse_detail_content";
let text_explanation = get_text_of_explanation(target_explanation);
let simplify_request = new XMLHttpRequest();
let warning_text = '<div class="alert alert-warning d-flex align-items-center alert-dismissible fade show" role="alert" id="answer_' + (a+1) + '_model_' + (e+1) + '_collapse_simplified_warning">';
warning_text += '<i class="bi bi-exclamation-triangle-fill" aria-label="Warning:"></i>';
warning_text += '<div>The following summarization was automatically generated, and may not be accurate. The details below show the actual reasoning.<button type="button" class="btn-close" data-bs-dismiss="alert" aria-label="Close"></button></div>';
warning_text += '</div>';
simplify_request.onload = function () {
if (this.responseText != '""') {
target_summary.innerHTML = warning_text + "Summary: " + this.responseText;
} else {
// There was no answer, so there is no summary. The contents of the details should be moved to the explanation part,
// and the details parts removed.

var detail_header_target = document.getElementById(detail_header_name);
var detail_content_target = document.getElementById(detail_content_name);

// Get the details content innerHTML
var explanation = detail_content_target.innerHTML;
// Set it to the value of the target.
target_summary.innerHTML = explanation;
// remove the header and the content elements.
detail_header_target.remove();
detail_content_target.remove();
}
}
simplify_request.open("POST", "{% url 'simplify' %}");
simplify_request.setRequestHeader("Content-Type", "application/json");
target_summary.innerHTML = "Getting AI Summary...";
console.log("Sending simplify request");
simplify_request.setRequestHeader('X-CSRFToken', csrftoken);
simplify_request.send(JSON.stringify({"explanation": text_explanation}));
}
}

$('#nav-answers-tab').tab('show');
draw_facts(); // So that new relevance information will be displayed in the interface.
} else {
@@ -1588,6 +1641,19 @@ <h6>Response</h6>
testrun_request.setRequestHeader('X-CSRFToken', csrftoken);
testrun_request.send(JSON.stringify(new_fact_data));
}

// Recursively collect the text content of an explanation element, putting each
// text node on its own line so it can be sent to the summarizer as plain text.
function get_text_of_explanation(element) {
var output = "";
if (element.hasChildNodes()) {
for (var c = 0; c < element.childNodes.length; c++) {
output += get_text_of_explanation(element.childNodes[c]);
}
} else if (element.nodeType === Node.TEXT_NODE) {
output += "\n" + element.data;
}
return output;
}

var view_form_element = document.getElementById('viewform');
function toggle_view_hidden(input) {
index = hidden_by_view.indexOf(input);