Technology Blogs by Members
Explore a vibrant mix of technical expertise, industry insights, and tech buzz in member blogs covering SAP products, technology, and events. Get in the mix!
kachiever
Participant
Welcome to the Fourth Episode of the Series: Hello World OpenAI: Crafting Accurate ChatGPT-Like Custom Search for SAPUI5 Application. So far we have installed the IDE, set up the environment, created a Secret Key for our OpenAI account, tested it using Postman, and created a NodeJS-based API Service on BTP-CF. In this episode, we will use our OpenAI Secret Key and extend that NodeJS API Service so it can build our knowledge base and insert the training data. Finally, we will use this same Search Domain whenever we perform any Search operation.


You can find all the existing & upcoming blog posts of this series via the introduction blog post linked below-

Click here

Prerequisites



  • You have a Trial Account set up for OpenAI and have created your Secret Key, as covered in the previous episode.

  • You have NodeJS & VS Code installed, as shared in an earlier episode.

  • You have created the NodeJS-based API Service shared in the last episode [Link].


 

What Are We Trying to Do?


OpenAI


OpenAI is an organization that develops advanced language models like GPT-3.5. It aims to create powerful and versatile AI systems for various applications, enhancing human-machine interaction and problem-solving. That was the bookish definition; as a developer, it's an org that has made AI accessible to all. We can use GPT for a chat experience, or use the advanced AI/ML capabilities exposed as APIs & libraries for various programming languages.

What do we want?


We started with the scenario where we wanted to create a custom Search Domain, or a kind of Model / Knowledge Base. When we give it an input, it should search and give us a ChatGPT-like accurate response from our Model. This can be used for custom search or categorization.

How will we do it?


OpenAI provides an API Service called Embeddings. It returns a vector representation of a given input that can be easily consumed by machine learning models and algorithms. In short, it converts your input into an AI/ML-consumable form. So we will convert the required Inputs / Texts / Phrases / Categories into Vector form. Whenever we get any input, we will convert it into a Vector as well, compare it only against our stored Vectors, and show the top 3 matching ones along with their Match Confidence Score.
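To make "compare the Vectors" concrete: the comparison we will use later is cosine similarity, which scores how closely two vectors point in the same direction. A minimal hand-rolled sketch (the compute-cosine-similarity package we install later computes the same thing for us):

```javascript
// Cosine similarity between two equal-length vectors:
// dot(a, b) / (||a|| * ||b||).
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Vectors pointing the same way score ~1, orthogonal ones score 0.
console.log(cosineSimilarity([1, 2, 3], [2, 4, 6])); // ≈ 1 (same direction)
console.log(cosineSimilarity([1, 0], [0, 1]));       // 0 (orthogonal)
```

The score is what the blog later calls the Match Confidence Score: the closer to 1, the better the match.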

Time to Code


Add Functionality


We will first code the Add functionality. We will provide an endpoint where users can add a custom Text / Phrase. Our code will convert it into a Vector and store it in a JSON file kept on the server side.

In our last episode, we added this endpoint. We will start writing our code below this.

Snippet 1
app.get('/', (req, res) => {
  res.send('<p>Welcome to OPEN API Gateway</p><p>Service is Up & Running </p><p></p>' + Date());
})

And remember: the below piece of code, added in the previous episode, should come last.

Snippet 2
app.listen(port, () => {
  console.log(`Open AI App is listening on port ${port}`)
})

So, Copy and paste the below code between Snippet 1 & Snippet 2.
// Endpoint to add more data to the JSON file
app.post('/add', (req, res) => {
  try {
    // Fetch Vector
    var newText = req.body.entry;
    const session_url = 'https://api.openai.com/v1/embeddings';
    var config_post = {
      method: 'post',
      url: session_url,
      headers: {
        'Authorization': 'Bearer ' + token,
        'Content-Type': 'application/json'
      },
      data: {
        "model": "text-embedding-ada-002",
        "input": newText
      }
    };
    axios(config_post)
      .then(function (response) {
        const existingData = readDataFromFile();
        const ques_vector = response.data["data"][0].embedding;
        var len = existingData.length;
        // Next id: one more than the last entry's id, or 0 for an empty file
        var id = len > 0 ? existingData[len - 1].id + 1 : 0;
        var entry = {
          id: id,
          text: newText,
          embedding: ques_vector
        };
        // Push data to JSON
        existingData.push(entry);
        saveDataToFile(existingData);
        res.status(201).json({ message: 'Data added successfully.' });
        // End of Push Data
      })
      .catch(function (error) {
        // Error
        console.log(error);
        res.status(500).send(error);
      });
  } catch (err) {
    res.status(500).json({ error: 'Failed to add data.' });
  }
});

Also, add the helper function to read data.
// Helper function to read data from the JSON file
function readDataFromFile() {
  try {
    const data = fs.readFileSync(dataFilePath, 'utf8');
    return JSON.parse(data);
  } catch (err) {
    // If the file doesn't exist or is empty, return an empty array
    return [];
  }
}

And a helper function for storing data.
// Helper function to save data to the JSON file
function saveDataToFile(data) {
  fs.writeFileSync(dataFilePath, JSON.stringify(data, null, 2), 'utf8');
}

 

In this code, we use our Token / OpenAI Secret Key to consume the OpenAI Embeddings Service, fetch the Vector using the "text-embedding-ada-002" model, and finally store it in our local JSON file.
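For reference, the Embeddings Service response nests the vector under data[0].embedding, which is exactly what the code above extracts from the axios response body. A toy stand-in (the real text-embedding-ada-002 vector has 1536 numbers, shortened to 3 here; fields beyond that nesting are illustrative):

```javascript
// Shape of an OpenAI Embeddings response, heavily abbreviated.
const sampleResponse = {
  object: 'list',
  data: [{ object: 'embedding', index: 0, embedding: [0.01, -0.02, 0.03] }],
  model: 'text-embedding-ada-002'
};

// Same extraction the /add endpoint performs on response.data:
const vector = sampleResponse.data[0].embedding;
console.log(vector.length); // 3
```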

Read Functionality


Once data is added, users may want to view what's in the Knowledge Base, so this endpoint will show all the stored data.

So, Copy and paste the below code snippet.
// Endpoint to display the content of the JSON file
app.get('/display', (req, res) => {
  try {
    const data = readDataFromFile();
    res.status(200).json(data);
  } catch (err) {
    res.status(500).json({ error: 'Failed to retrieve data.' });
  }
});

Delete Functionality


Once data has been added and viewed, users may want to delete entries from the Knowledge Base, so this endpoint handles deletion of the stored records.

So, Copy and paste the below code snippet.
// Endpoint to delete content from the JSON file
app.delete('/delete/:index', (req, res) => {
  try {
    const indexToDelete = parseInt(req.params.index);
    const existingData = readDataFromFile();

    if (indexToDelete < 0 || indexToDelete >= existingData.length) {
      return res.status(400).json({ error: 'Invalid index.' });
    }

    existingData.splice(indexToDelete, 1);
    saveDataToFile(existingData);
    res.status(200).json({ message: 'Data deleted successfully.' });
  } catch (err) {
    res.status(500).json({ error: 'Failed to delete data.' });
  }
});

Ask AI


Now the final functionality, for which we did everything else: the endpoint where we send an input, and it returns the top 3 matching texts/phrases/categories stored in our custom Search Domain / Model / Knowledge Base.
// Call the Open AI
app.post('/askai', function (req, res) {
  const data = readDataFromFile();
  var ques = req.body.ques;
  let table = [];
  // Fetch Vector
  const session_url = 'https://api.openai.com/v1/embeddings';
  var config_post = {
    method: 'post',
    url: session_url,
    headers: {
      'Authorization': 'Bearer ' + token,
      'Content-Type': 'application/json'
    },
    data: {
      "model": "text-embedding-ada-002",
      "input": ques
    }
  };
  // Call
  axios(config_post)
    .then(function (response) {
      const ques_vector = response.data["data"][0].embedding;
      // Check Similarities
      for (let i = 0; i < data.length; i++) {
        table.push([data[i].text, similarity(ques_vector, data[i].embedding)]);
      }
      // Sort Descending by similarity score
      table.sort((a, b) => b[1] - a[1]);
      console.log("Question : " + ques);
      console.log("Similarity Check");
      console.log(table);
      // Top 3
      const top3 = table.slice(0, 3);
      console.log("Top 3 Matches");
      console.log(top3);
      res.send(top3);
    })
    .catch(function (error) {
      // Error
      console.log(error);
      res.status(500).send(error);
    });
})

What we are doing: we already have the Vector Maps of the Texts/Phrases stored in our Knowledge Base. When a new Text arrives as input, we convert it into a vector via the Embeddings service of OpenAI. Then we use the Node library compute-cosine-similarity to compare the vectors, get the scores, & finally send back the Top 3 results as the response.
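The compare-sort-slice logic can be seen in isolation with toy 2-dimensional vectors standing in for real 1536-dimensional embeddings (the values below are made up for illustration, with cosine similarity hand-rolled to stay self-contained):

```javascript
// Toy knowledge base: each entry pairs a text with a (tiny) vector.
const knowledgeBase = [
  { text: 'IT Support', embedding: [0.9, 0.1] },
  { text: 'Travel',     embedding: [0.5, 0.5] },
  { text: 'Finance',    embedding: [0.1, 0.9] },
  { text: 'Need Help',  embedding: [0.8, 0.3] }
];

function cosine(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i]; na += a[i] * a[i]; nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Score every entry against the query vector, sort descending,
// and keep the three best matches -- the same steps /askai performs.
function top3(queryVector) {
  return knowledgeBase
    .map(e => [e.text, cosine(queryVector, e.embedding)])
    .sort((a, b) => b[1] - a[1])
    .slice(0, 3);
}

console.log(top3([1, 0.2]).map(row => row[0]));
// → [ 'IT Support', 'Need Help', 'Travel' ]
```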

Final Code


So, we are all set. We can move both helper functions to the end of the code to make it more organized/readable. If you have followed all the steps correctly, your code should look like this:
const express = require('express');
const bodyParser = require('body-parser');
const fs = require('fs');
const app = express();
var similarity = require('compute-cosine-similarity');
const basicAuth = require('express-basic-auth');
var cors = require('cors');
const axios = require('axios');
const PORT = 8080;
const token = 'sk--[Your OpenAI Secret Key]--';

// Data File
const dataFilePath = 'data.json';

// Middleware to parse incoming JSON data & Disable CORS
app.use(bodyParser.json());
app.use(cors())

// Adding Auth
app.use(basicAuth({
  users: { 'admin': 'oursecret' }
}))

// Welcome Page
app.get('/', (req, res) => {
  res.send('<p>Welcome to OPEN API Gateway</p><p>Service is Up & Running </p><p></p>' + Date());
})

// Endpoint to add more data to the JSON file
app.post('/add', (req, res) => {
  try {
    // Fetch Vector
    var newText = req.body.entry;
    const session_url = 'https://api.openai.com/v1/embeddings';
    var config_post = {
      method: 'post',
      url: session_url,
      headers: {
        'Authorization': 'Bearer ' + token,
        'Content-Type': 'application/json'
      },
      data: {
        "model": "text-embedding-ada-002",
        "input": newText
      }
    };
    axios(config_post)
      .then(function (response) {
        const existingData = readDataFromFile();
        const ques_vector = response.data["data"][0].embedding;
        var len = existingData.length;
        // Next id: one more than the last entry's id, or 0 for an empty file
        var id = len > 0 ? existingData[len - 1].id + 1 : 0;
        var entry = {
          id: id,
          text: newText,
          embedding: ques_vector
        };
        // Push data to JSON
        existingData.push(entry);
        saveDataToFile(existingData);
        res.status(201).json({ message: 'Data added successfully.' });
        // End of Push Data
      })
      .catch(function (error) {
        // Error
        console.log(error);
        res.status(500).send(error);
      });
  } catch (err) {
    res.status(500).json({ error: 'Failed to add data.' });
  }
});

// Endpoint to display the content of the JSON file
app.get('/display', (req, res) => {
  try {
    const data = readDataFromFile();
    res.status(200).json(data);
  } catch (err) {
    res.status(500).json({ error: 'Failed to retrieve data.' });
  }
});

// Endpoint to delete content from the JSON file
app.delete('/delete/:index', (req, res) => {
  try {
    const indexToDelete = parseInt(req.params.index);
    const existingData = readDataFromFile();

    if (indexToDelete < 0 || indexToDelete >= existingData.length) {
      return res.status(400).json({ error: 'Invalid index.' });
    }

    existingData.splice(indexToDelete, 1);
    saveDataToFile(existingData);
    res.status(200).json({ message: 'Data deleted successfully.' });
  } catch (err) {
    res.status(500).json({ error: 'Failed to delete data.' });
  }
});

// Call the Open AI
app.post('/askai', function (req, res) {
  const data = readDataFromFile();
  var ques = req.body.ques;
  let table = [];
  // Fetch Vector
  const session_url = 'https://api.openai.com/v1/embeddings';
  var config_post = {
    method: 'post',
    url: session_url,
    headers: {
      'Authorization': 'Bearer ' + token,
      'Content-Type': 'application/json'
    },
    data: {
      "model": "text-embedding-ada-002",
      "input": ques
    }
  };
  // Call
  axios(config_post)
    .then(function (response) {
      const ques_vector = response.data["data"][0].embedding;
      // Check Similarities
      for (let i = 0; i < data.length; i++) {
        table.push([data[i].text, similarity(ques_vector, data[i].embedding)]);
      }
      // Sort Descending by similarity score
      table.sort((a, b) => b[1] - a[1]);
      console.log("Question : " + ques);
      console.log("Similarity Check");
      console.log(table);
      // Top 3
      const top3 = table.slice(0, 3);
      console.log("Top 3 Matches");
      console.log(top3);
      res.send(top3);
    })
    .catch(function (error) {
      // Error
      console.log(error);
      res.status(500).send(error);
    });
})

// Helper function to read data from the JSON file
function readDataFromFile() {
  try {
    const data = fs.readFileSync(dataFilePath, 'utf8');
    return JSON.parse(data);
  } catch (err) {
    // If the file doesn't exist or is empty, return an empty array
    return [];
  }
}

// Helper function to save data to the JSON file
function saveDataToFile(data) {
  fs.writeFileSync(dataFilePath, JSON.stringify(data, null, 2), 'utf8');
}



// Start the server
app.listen(PORT, () => {
  console.log(`OpenAI API is running on port ${PORT}`)
})

Let's Quickly Test Our Code


Start the Service


Open the Terminal & run npm start to start our API Service.
npm start


 

Let's train our Model (In a way)


Step 1: Open Postman and click on the Add Button (+).

Step 2: Use the below URL:
http://localhost:8080/add

Step 3: Choose POST as the call type, select the configured Basic Auth in the Authorization tab (shown in the last episode [Link]), & add the below Body under Body --> Raw --> JSON.
{
  "entry": "Need Help"
}

Step 4: Click on send, and you should get a success message & 201 Status code.


Here we are basically training the Model to recognize Need-Help scenarios, where the Training JSON format is:
{
  "entry": "Text/Phrase to be Trained"
}

Train 3 more scenarios the same way:

Travel, Finance, IT Support
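If you prefer scripting over clicking through Postman, the same training calls can be built in Node. This is a sketch assuming the local endpoint and the 'admin'/'oursecret' Basic Auth credentials from the code above; it only builds the axios request configs, and sending them is left as the commented-out last line:

```javascript
// Build the /add request config for a given training phrase.
// URL and credentials come from the service code above; adjust
// them if you changed the port or the basic-auth setup.
function buildAddRequest(text) {
  return {
    method: 'post',
    url: 'http://localhost:8080/add',
    auth: { username: 'admin', password: 'oursecret' },
    headers: { 'Content-Type': 'application/json' },
    data: { entry: text }
  };
}

// One request per scenario we want the knowledge base to recognize.
const configs = ['Travel', 'Finance', 'IT Support'].map(buildAddRequest);
console.log(configs.length);        // 3
console.log(configs[2].data.entry); // 'IT Support'
// To actually train: configs.forEach(c => axios(c));
```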

 


Let's Check if everything is Saved


Step 1:  In Postman, click on Add Button (+).

Step 2: Use the below URL:
http://localhost:8080/display

Step 3: Execute the GET call after setting Basic Auth in the Authorization tab. We will be able to see all the trained texts.



Let's test the Tool Now


Step 1: In Postman, click on the Add Button (+).

Step 2: Use the below URL:
http://localhost:8080/askai

Step 3: Choose POST as the call type, select the configured Basic Auth in the Authorization tab (shown in the last episode [Link]), & add the below Body under Body --> Raw --> JSON.
{
  "ques": "My Laptop isn't working"
}

Step 4: Click on send, and you should get the following result & 200 Status code.


The AI Service is able to identify the top 3 closest matches:

The best match is "IT Support", followed by "Need Help" (since we asked for help). The "Travel" score falls below 80%, but its presence shows that the search is limited to the Knowledge Base we added, and the AI found "Travel" closer to the query than "Finance".

Here the input JSON format is:
{
  "ques": "/Your Query/"
}


Let's Delete


Step 1: Run display.


Step 2: We will delete "Travel", which has ID 2. Since nothing has been deleted yet, its ID matches its position in the file, which is the index the endpoint expects. Use the below URL:
http://localhost:8080/delete/2

Step 3: Choose DELETE, set Authorization, and click on SEND.


We will get a success message with 200 Status Code.

Step 4: Let's confirm via Read/Display.
http://localhost:8080/display


"Travel" has been successfully deleted.
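One subtlety worth noting: /delete/:index splices by array position, while the stored id values are never renumbered. A small sketch of what happens to the ids after the deletion above:

```javascript
// Same entries the walkthrough trained, minus the embeddings.
let entries = [
  { id: 0, text: 'Need Help' },
  { id: 1, text: 'IT Support' },
  { id: 2, text: 'Travel' },
  { id: 3, text: 'Finance' }
];

// The /delete/:index endpoint performs exactly this splice.
entries.splice(2, 1);

// Remaining ids keep their original values, so index and id
// drift apart after the first deletion.
console.log(entries.map(e => e.id)); // → [ 0, 1, 3 ]
```

After this point, deleting "Finance" would require /delete/2 (its new index), not /delete/3 (its id).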

Let’s Summarize


We created our OpenAI-based NodeJS API Service that can be deployed on BTP Cloud Foundry. We used the Embeddings service of OpenAI & created vector maps. Further, we tested everything locally using the Postman tool. That's all for this episode; I will meet you guys in Episode 5.

Next Episode : Episode 5