Google Cloud Dialogflow intent detection Node.js example not working (JavaScript)

I am trying to implement a very simple Dialogflow agent integration with Node.js.
Here is what I have done so far:
I followed the code from the Intent detection documentation.
I added the service account private key file (.json) to my server.
I set the environment variable GOOGLE_APPLICATION_CREDENTIALS to the path of my .json private key file.
Here is the code I am trying to run right now:
require('dotenv').config()

const projectId = 'gg-chatbot-216808';
const sessionId = 'quickstart-session-id';
const query = 'hello';
const languageCode = 'en-US';

// Instantiate a Dialogflow client.
const dialogflow = require('dialogflow');
const sessionClient = new dialogflow.SessionsClient();

// Define session path
const sessionPath = sessionClient.sessionPath(projectId, sessionId);

// The text query request.
const request = {
  session: sessionPath,
  queryInput: {
    text: {
      text: query,
      languageCode: languageCode,
    },
  },
};

// This prints the private key path correctly.
console.log(process.env.GOOGLE_APPLICATION_CREDENTIALS);

// Send request and log result
sessionClient
  .detectIntent(request)
  .then(responses => {
    console.log('Detected intent');
    const result = responses[0].queryResult;
    console.log(`  Query: ${result.queryText}`);
    console.log(`  Response: ${result.fulfillmentText}`);
    if (result.intent) {
      console.log(`  Intent: ${result.intent.displayName}`);
    } else {
      console.log(`  No intent matched.`);
    }
  })
  .catch(err => {
    console.error('ERROR:', err);
  });
Then I get this error in the console when I run this file:
Auth error:Error: invalid_user: Robot is disabled.
ERROR: { Error: 14 UNAVAILABLE: Getting metadata from plugin failed with error: invalid_user: Robot is disabled.
at Object.exports.createStatusError (/var/www/html/google_auth/node_modules/grpc/src/common.js:87:15)
at Object.onReceiveStatus (/var/www/html/google_auth/node_modules/grpc/src/client_interceptors.js:1188:28)
at InterceptingListener._callNext (/var/www/html/google_auth/node_modules/grpc/src/client_interceptors.js:564:42)
at InterceptingListener.onReceiveStatus (/var/www/html/google_auth/node_modules/grpc/src/client_interceptors.js:614:8)
at callback (/var/www/html/google_auth/node_modules/grpc/src/client_interceptors.js:841:24)
code: 14,
metadata: Metadata { _internal_repr: {} },
details: 'Getting metadata from plugin failed with error: invalid_user: Robot is disabled.' }

I also faced a similar issue with my Angular bot.
What I did was, instead of relying on the credentials from the JSON file, create an object with private_key and client_email (these values can be taken from the service account private key .json file) and pass that object when setting up the session client:

var config = {
  credentials: {
    private_key: "YOUR_PRIVATE_KEY",
    client_email: "YOUR_CLIENT_EMAIL"
  }
};
const sessionClient = new dialogflow.SessionsClient(config);

Note: copy the full private_key string from the .json file. It starts with "-----BEGIN PRIVATE KEY-----\n...".
Also, in GCP go to the project -> IAM and try setting the service account's role to Dialogflow API Admin. Check if this works.

If this has not been resolved yet, another option is to pass the key file path directly to the session client via keyFilename:

const sessionClient = new dialogflow.SessionsClient({
  keyFilename: "path of your credentials.json file"
});

or

let filePath = process.env.GOOGLE_APPLICATION_CREDENTIALS = "Location of credentials file";
const sessionClient = new dialogflow.SessionsClient({
  keyFilename: filePath
});

This will work even if no GOOGLE_APPLICATION_CREDENTIALS environment variable is set on the system.
Hope this is helpful.

Related

Agora start method error: post method API body check failed

I'm building a video-calling app using Next.js and agora.io 4, and I followed the steps mentioned in the docs:
I enabled Agora Cloud Recording.
I called the acquire method and got the resourceId.
Then I called the start method, but it always fails with the error "post method API body check failed!"
However, it works perfectly in Postman.
Here's the code:
import axios from "axios";
import chalk from "chalk";

// AWS S3 storage bucket credentials
const secretKey = process.env.S3_SECRET_KEY;
const accessKey = process.env.S3_ACCESS_KEY;
const bucket = process.env.S3_BUCKET_NAME;
const region = process.env.S3_BUCKET_REGION;
const vendor = process.env.S3_VENDOR;

// Agora credentials
const appId = process.env.APP_ID;
const key = process.env.KEY;
const secret = process.env.SECRET;

export default async function startHandler(req, res) {
  // Call the Agora start method
  const { uid, cname, resourceId, token } = req.body;
  const plainCredential = `${key}:${secret}`;
  const encodedCredential = Buffer.from(plainCredential).toString("base64"); // Encode with base64
  const authorizationField = `Basic ${encodedCredential}`;
  const data = {
    uid,
    cname,
    clientRequest: {
      recordingConfig: {
        streamMode: "standard",
        channelType: 0,
        subscribeUidGroup: 0,
      },
      storageConfig: {
        accessKey,
        region,
        bucket,
        secretKey,
        vendor,
      },
    },
  };
  const headers = {
    "Content-Type": "application/json",
    Authorization: authorizationField,
  };
  const startUrl = `https://api.agora.io/v1/apps/${appId}/cloud_recording/resourceid/${resourceId}/mode/individual/start`;
  try {
    const response = await axios.post(startUrl, data, {
      headers,
    });
    res.status(200).send(response.data);
  } catch (error) {
    console.error(error);
    res.send(error);
  }
}
Any help/hint would be much appreciated
I found the fix!
First, you may be tricked by the uid returned from the Agora join method: surprisingly, it returns a Number, while the start method expects the uid to be a string, so don't forget to call uid.toString().
In the storageConfig object, you should also check the type of each attribute: region and vendor are each expected to be of type Number. If you're storing this info in a .env file, remember that environment files only store strings, so you need to convert them back to Numbers.
This problem took me 2 days, so I hope this will be useful for you!
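To make that concrete, here is a minimal sketch of the corrected request body (variable names follow the handler above; the actual region and vendor codes depend on your bucket, so treat the conversions, not the values, as the point):

// Sketch only: uid is sent as a string, and the storageConfig values read from
// .env (always strings) are converted to Numbers before the request is built.
const data = {
  uid: uid.toString(),            // the Agora join() method returns a Number
  cname,
  clientRequest: {
    recordingConfig: {
      streamMode: "standard",
      channelType: 0,
      subscribeUidGroup: 0,
    },
    storageConfig: {
      accessKey,
      secretKey,
      bucket,
      region: Number(region),     // e.g. "0" in .env becomes 0
      vendor: Number(vendor),     // e.g. "1" in .env becomes 1
    },
  },
};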

Analytics.js data to PubSub GCP

I am using analytics.js to implement custom tracking on my website. Since I want to send the hit to PubSub, I used this documentation (Node.js tab) to connect my TypeScript code to PubSub (not perfect, I know; I am trying to make it work before cleaning it up).
ga(() => {
  ga("set", "cookieExpires", 0);
  const tracker = ga.getByName(trackerName);
  tracker.set("sendHitTask", (model: any) => {
    var refusedParam = ["_gid", "tid"];
    let hit = model.get("hitPayload").split("&").filter((paramValue: string) => {
      let param = paramValue.split("=")[0];
      return (refusedParam.indexOf(param) == -1);
    }).join("&");

    /**
     * TODO(developer): Uncomment these variables before running the sample.
     */
    const topicNameOrId = 'tracking-test';
    const data = JSON.stringify(hit);

    // Creates a client; cache this for further use
    const pubSubClient = new PubSub();
    console.log("DATA IS " + data);

    async function publishMessage() {
      // Publishes the message as a string, e.g. "Hello, world!" or JSON.stringify(someObject)
      const dataBuffer = Buffer.from(data);
      try {
        const messageId = await pubSubClient
          .topic(topicNameOrId)
          .publishMessage({data: dataBuffer});
        console.log(`Message ${messageId} published.`);
      } catch (error) {
        console.error(`Received error while publishing: ${error.message}`);
        process.exitCode = 1;
      }
    }
    publishMessage();
  });
});
I don't get any error when building and running this code. But when I connect to my website locally, I see the following error in the JS console: Uncaught TypeError: a.grpc is undefined.
I tried adding grpc to my package.json, but that did not remove the error or produce the correct behaviour.
Did I miss something? How can I use analytics.js and send data directly to PubSub?

How to fetch Amazon Cognito Identity ID (user_identity_id) for the user from the lambda function?

In the Amplify documentation, under the Storage/File access levels section there is a paragraph that states:
Files are stored under private/{user_identity_id}/ where the user_identity_id corresponds to the unique Amazon Cognito Identity ID for that user.
How to fetch user_identity_id from the lambda function?
The request to the Lambda is authorized and the event.requestContext.authorizer.claims object is available; I can see the user data, but not the user_identity_id.
EDIT: Now I see that there is a field event.requestContext.identity.cognitoIdentityId, but its value is null. I still need to find a way to fetch it.
OK, so there is no straightforward way to map a Cognito Identity ID to a Cognito user. There is a lengthy discussion here where a couple of workarounds can be found. For now, I'm going to use the solution where, instead of the identity_id, you specify a custom attribute (most likely the sub) as the folder name.
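As an illustration, here is a minimal sketch of that workaround inside a Lambda behind API Gateway (the claims object is the one mentioned above; the private/ prefix mirrors the Amplify convention, and the handler shape is an assumption, not the author's code):

// Hypothetical example: use the user pool `sub` claim as the per-user folder name
// instead of the Cognito Identity ID.
exports.handler = async (event) => {
  const claims = event.requestContext.authorizer.claims;
  const sub = claims.sub;                      // unique per user in the user pool
  const userPrefix = `private/${sub}/`;        // used in place of private/{user_identity_id}/
  // ...read or write S3 objects under userPrefix here...
  return { statusCode: 200, body: JSON.stringify({ userPrefix }) };
};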
EDIT: There is another solution that might help (found somewhere on the internet, and I verified that it works)
const AWS = require('aws-sdk');
const cognitoIdentity = new AWS.CognitoIdentity();

function getCognitoIdentityId(jwtToken) {
  const params = getCognitoIdentityIdParams(jwtToken);
  return cognitoIdentity
    .getId(params)
    .promise()
    .then(data => {
      if (data.IdentityId) {
        return data.IdentityId;
      }
      throw new Error('Invalid authorization token.');
    });
}

function getCognitoIdentityIdParams(jwtToken) {
  const loginsKey = `cognito-idp.${process.env.REGION}.amazonaws.com/${process.env.USERPOOLID}`;
  return {
    IdentityPoolId: `${process.env.IDENTITY_POOL_ID}`,
    Logins: {
      [loginsKey]: jwtToken,
    },
  };
}
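For completeness, here is a possible way to call these helpers from the Lambda handler (assuming the Cognito ID token arrives in the Authorization header, as in the original question; the handler itself is a sketch):

// Hypothetical usage inside the Lambda entry point.
exports.handler = async (event) => {
  const jwtToken = event.headers.Authorization;      // Cognito ID token sent by the client
  const identityId = await getCognitoIdentityId(jwtToken);
  return { statusCode: 200, body: JSON.stringify({ identityId }) };
};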
If the user accesses the Lambda through GraphQL via the AppSync service, then the identity is stored in event.identity.owner.
Here is some TypeScript code I use to pull the user_identity_id from the event. However, the user doesn't always call the Lambda directly, so the user identity can also be derived from an authorized IAM role.
export function ownerFromEvent(event: any = {}): string {
  if (
    event.identity.userArn &&
    event.identity.userArn.split(":")[5].startsWith("assumed-role")
  ) {
    // This is a request from a function over IAM.
    return event.arguments.input.asData.owner;
  } else {
    return event.identity.owner;
  }
}
For anyone else still struggling with this, I was finally able to use the AWS SDK for JavaScript v3 to obtain a Cognito user's IdentityId and credentials in a Lambda function invoked via API Gateway with a Cognito User Pool authorizer, using the Cognito user's identity jwtToken passed in the Authorization header of the request.
Here is the code used in my JavaScript Lambda Function:
const IDENTITY_POOL_ID = "us-west-2:7y812k8a-1w26-8dk4-84iw-2kdi849sku72"
const USER_POOL_ID = "cognito-idp.us-west-2.amazonaws.com/us-west-2_an976DxVk"
const { CognitoIdentityClient } = require("@aws-sdk/client-cognito-identity");
const { fromCognitoIdentityPool } = require("@aws-sdk/credential-provider-cognito-identity");

exports.handler = async (event, context) => {
  const cognitoidentity = new CognitoIdentityClient({
    credentials: fromCognitoIdentityPool({
      client: new CognitoIdentityClient(),
      identityPoolId: IDENTITY_POOL_ID,
      logins: {
        [USER_POOL_ID]: event.headers.Authorization
      }
    }),
  });

  var credentials = await cognitoidentity.config.credentials()
  console.log(credentials)
  // {
  //   identityId: 'us-west-2:d393294b-ff23-43t6-d8s5-59876321457d',
  //   accessKeyId: 'ALALA2RZ7KTS7STD3VXLM',
  //   secretAccessKey: '/AldkSdt67saAddb6vddRIrs32adQCAo99XM6',
  //   sessionToken: 'IQoJb3JpZ2luX2VjEJj//////////...', // sessionToken cut for brevity
  //   expiration: 2022-07-17T08:58:10.000Z
  // }

  var identity_ID = credentials.identityId
  console.log(identity_ID)
  // us-west-2:d393294b-ff23-43t6-d8s5-59876321457d

  const response = {
    statusCode: 200,
    headers: {
      "Access-Control-Allow-Headers": "*",
      "Access-Control-Allow-Origin": "*",
      "Access-Control-Allow-Methods": "OPTIONS,POST,GET,PUT"
    },
    body: JSON.stringify(identity_ID)
  };
  return response;
}
After a Cognito user has signed in to my application, I can use the Auth module of aws-amplify together with fetch() in my React Native app to invoke the Lambda function shown above, sending a request to my API Gateway trigger (authenticated with a Cognito User Pool authorizer) with the following code:
import { Auth } from 'aws-amplify';

var APIGatewayEndpointURL = 'https://5lstgsolr2.execute-api.us-west-2.amazonaws.com/default/-'
var response = {}

async function getIdentityId() {
  var session = await Auth.currentSession()
  var IdToken = await session.getIdToken()
  var jwtToken = await IdToken.getJwtToken()
  var payload = {}
  await fetch(APIGatewayEndpointURL, { method: "POST", body: JSON.stringify(payload), headers: { Authorization: jwtToken } })
    .then(async (result) => {
      response = await result.json()
      console.log(response)
    })
}
More info on how to authenticate using aws-amplify can be found here: https://docs.amplify.aws/ui/auth/authenticator/q/framework/react-native/#using-withauthenticator-hoc

How to fix a "Firebase database initialized multiple times" error when both a React SSR cloud function and another cloud function initialize the database?

I have updated the question, as I have found the root cause of the issue.
My React SSR app, which uses the Firebase database on the client, is served by a cloud function named app, and it throws the error Error: FIREBASE FATAL ERROR: Database initialized multiple times. Please make sure the format of the database URL matches with each database() call. When I comment the functions out one by one and deploy, everything works perfectly, but when I deploy them together it doesn't. How do I separate these two while keeping both in the same repo?
ORIGINAL question: Why is the Firebase cloud function throwing the error 'The default Firebase app does not exist.'?
So I am trying out Firebase functions for the first time, and admin.messaging() is throwing the following error. Help me figure out why.
If I look at the console I get results up to console.log('deviceToken', deviceToken);
so what's wrong in const messageDone = await admin.messaging().sendToDevice(deviceToken, payload);?
const functions = require('firebase-functions');
const admin = require('firebase-admin');

exports.updateUnreadCount = functions.database.ref('/chats/{chatId}/{messageId}')
  .onCreate(async (snap, context) => {
    const appOptions = JSON.parse(process.env.FIREBASE_CONFIG);
    appOptions.databaseAuthVariableOverride = context.auth;
    const adminApp = admin.initializeApp(appOptions, 'app');
    const { message, senderId, receiverUid } = snap.val();
    console.log(message, senderId, receiverUid);
    console.log('------------------------');
    const deleteApp = () => adminApp.delete().catch(() => null);
    try {
      const db = adminApp.database();
      const reciverUserRef = await db.ref(`users/${receiverUid}/contacts/${senderId}/`);
      console.log('reciverUserRef', reciverUserRef);
      const deviceTokenSnapshot = await reciverUserRef.child('deviceToken').once('value');
      const deviceToken = await deviceTokenSnapshot.val();
      console.log('deviceToken', deviceToken);
      const payload = {
        notification: {
          title: 'Test Notification Title',
          body: message,
          sound: 'default',
          badge: '1'
        }
      };
      const messageDone = await admin.messaging().sendToDevice(deviceToken, payload);
      console.log('Successfully sent message: ', JSON.stringify(messageDone));
      return deleteApp().then(() => res);
    } catch (err) {
      console.log('error', err);
      return deleteApp().then(() => Promise.reject(err));
    }
  });
Update 1: According to https://firebase.google.com/docs/cloud-messaging/send-message#send_to_a_topic, the admin.messaging().sendToDevice(deviceToken, payload) API is only available in the Admin Node.js SDK?
So I switched to:
const payload = {
  data: {
    title: 'Test Notification Title',
    body: message,
    sound: 'default',
    badge: '1'
  },
  token: deviceToken
};
const messageDone = await admin.messaging().send(payload);
This is not working either; I'm getting the error Error: The default Firebase app does not exist. Make sure you call initializeApp() before using any of the Firebase services. Any lead will be helpful.
EDIT: I finally got the function working.
My index.js exports the following functions:
exports.app = functions.https.onRequest(app); // React SSR
exports.updateChat = functions.database.ref('/chats/{chatId}/{messageId}').onCreate(updateChat);
exports.app is a React SSR function, which I am using to host my site. It uses the database too, and throws the error about multiple database instances.
When I comment them out one by one and deploy, they work perfectly, but when I deploy them together it doesn't work. How do I separate these two while keeping both in the same repo? Any suggestions, please?
You can initialise the db outside the exported function:

const admin = require('firebase-admin');
const adminApp = admin.initializeApp(appOptions, 'app');
// continue code
Update:

const admin = require('firebase-admin');

function initialize(options, name = 'app') {
  try {
    // Reuse the named app if it has already been initialised; otherwise create it.
    const existing = admin.apps.find((app) => app && app.name === name);
    if (existing) {
      return existing;
    }
    return admin.initializeApp(options, name);
  } catch (err) {
    console.error(err);
  }
}

Modify this snippet as per your need and try it out.
It abstracts the initialisation of the app into another function. Just call this function at the appropriate place in your code.
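As a rough illustration of initialising outside the exported functions, an index.js along these lines (handler names follow the question; the database path written in updateChat is a placeholder) would let both exports share one Admin app:

const functions = require('firebase-functions');
const admin = require('firebase-admin');

// Initialise the default app exactly once, at module load time.
if (!admin.apps.length) {
  admin.initializeApp();
}
const db = admin.database();   // both handlers reuse this instance

exports.app = functions.https.onRequest((req, res) => {
  // The React SSR handler would read from `db` here instead of calling
  // admin.initializeApp() / database() again inside the request.
  res.status(200).send('ok');
});

exports.updateChat = functions.database
  .ref('/chats/{chatId}/{messageId}')
  .onCreate((snap, context) => {
    // Placeholder write using the shared instance.
    return db.ref('/meta/lastMessage').set(snap.val());
  });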

Error: Registration token(s) provided to sendToDevice()

Now I'm working on my final project. I am trying to send a notification using a Firebase cloud function when the onUpdate trigger fires, but I get an error. I have followed tutorials on YouTube and various websites, but I don't get it. By the way, I'm new to Firebase. Here is my index.js code:
const functions = require('firebase-functions');

// Firebase function and handling notification logic
const admin = require('firebase-admin');
admin.initializeApp(functions.config().firebase);

exports.pushNotification = functions.database.ref('/Sensor').onWrite((change, context) => {
  const sensor = change.after.val();
  const payload = {
    notification: {
      Title: "Alert",
      Body: "Open pipe detect !",
      icon: "default"
    }
  };
  return admin.messaging().sendToDevice(sensor.token, payload)
    .then((response) => {
      return console.log("Successfully sent message:", response);
    });
});
the project structure is like this:
**water-system**
+-- Sensor
|   +-- Pipe
|       +-- pipeName
|       +-- solenoid
|           +-- status        // trigger on this update
+-- User
    +-- Id1
    |   +-- email
    |   +-- name
    |   +-- token             // token stored by this user
    +-- Id2
    +-- Id3
        +-- token             // also stores a token
So when the child node under Sensor is updated, it should send a notification to the users who have stored a token (users Id1 and Id3). Glad if anyone could help me solve this problem.
Try storing the tokens in this format:

"tokens" : {
  "cXyVF6oUGuo:APA91bHTSUPy31JjMVTYK" : true,
  "deL50wnXUZ0:APA91bGAF-kWMNxyP6LGH" : true,
  "dknxCjdSQ1M:APA91bGFkKeQxB8KPHz4o" : true,
  "eZunoQspodk:APA91bGzG4J302zS7sfUW" : true
}

Whenever you want to write a new token, just do a set:

firebase.app().database().ref(`/user/${uid}/tokens/${token}`).set(true);

And to create an array for sendToDevice:

const tokensList = Object.keys(tokens.val());
return admin.messaging().sendToDevice(tokensList, payload);
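Putting those pieces together, here is a minimal sketch of the trigger (the trigger path and the /User/{uid}/tokens location are assumptions based on the structure shown in the question, not a drop-in fix):

const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.pushNotification = functions.database
  .ref('/Sensor/Pipe/solenoid/status')          // hypothetical path to the updated child
  .onUpdate(async (change, context) => {
    const payload = {
      notification: {
        title: 'Alert',                          // note: FCM expects lowercase keys here
        body: 'Open pipe detected!',
      },
    };

    // Collect every stored token from all users.
    const usersSnap = await admin.database().ref('/User').once('value');
    const tokens = [];
    usersSnap.forEach((userSnap) => {
      const userTokens = userSnap.child('tokens').val();
      if (userTokens) {
        tokens.push(...Object.keys(userTokens));
      }
    });

    if (tokens.length === 0) {
      return null;                               // nobody to notify
    }
    return admin.messaging().sendToDevice(tokens, payload);
  });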
