I want to do custom validation for Node.js HTTPS certificate verification, so in the HTTPS options I have set the rejectUnauthorized property to false.
var httpsOptions = {
  ...
  rejectUnauthorized: false,
  ...
};
Now, even if certificate verification fails internally, the request won't fail. I want to handle that part manually; specifically, I want to handle the 'unhandled critical extension' error. My code to do so is:
var req = https.request(httpsOptions, (res) => {
  var data = '';
  res.on('data', (chunk) => {
    data += chunk;
  });
  res.on('end', () => {
    console.log(data);
  });
});
req.on("socket", function () {
  req.socket.on('secureConnect', () => {
    if (!req.socket.authorized) {
      if (req.socket.authorizationError === 'unhandled critical extension') {
        // Place to verify extensions
      }
      process.nextTick(() => { req.abort(); });
    }
  });
});
req.write(JSON.stringify(requestObj));
req.end();
The above code works as expected: I can tell when an 'unhandled critical extension' error occurs. Inside the if condition (the place to verify extensions), I want to see which critical extensions are unhandled. If they don't match the list I have, I want to abort the request. req.socket has so many properties that I could not paste it here, but there is no field in it that holds those unhandled critical extensions. How can I extract the unhandled critical extensions that caused the error?
Note: I have seen some npm packages that can parse SSL certificates, like x509 and PKIjs. They are quite confusing, and I could not find any working example that solves my problem.
EDIT:
req.socket.getPeerCertificate().raw gives the DER certificate as a Buffer. How do I decode it and view those extensions?
First we need to get the server certificate. We can use req.socket.getPeerCertificate() to get it. It returns an object, but that object won't have an extensions section. However, it does contain the entire certificate as a Buffer in its raw property.
var certInBuf = req.socket.getPeerCertificate().raw;
There are a lot of packages available to convert this Buffer to PEM format. One of them is pemtools:
var certInPem = pemtools(certInBuf, 'CERTIFICATE').toString();
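If you prefer not to pull in a dependency for this step, the conversion is just base64 encoding with PEM armor. A dependency-free sketch (assuming the standard 64-character line wrapping; pemtools does essentially this for you):

```javascript
// Sketch: convert a DER certificate Buffer to a PEM string.
// Assumes standard PEM formatting (64-character base64 lines).
function derToPem(derBuffer) {
  const b64 = derBuffer.toString('base64');
  const lines = b64.match(/.{1,64}/g) || [];
  return ['-----BEGIN CERTIFICATE-----', ...lines, '-----END CERTIFICATE-----'].join('\n');
}
```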
Then we can use node-forge to parse the certificate and extract the extensions.
var pki = require('node-forge').pki;
var extensions = pki.certificateFromPem(certInPem).extensions;
Then we can validate the extensions. If we are not satisfied, we can abort the request by calling req.abort().
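For the validation step itself, a minimal sketch might compare each critical extension against an allowlist of extensions your code knows how to handle. The allowlist contents below are purely illustrative; node-forge exposes each parsed extension with at least name (or an OID in id for unknown extensions) and critical fields:

```javascript
// Illustrative allowlist of extension names we know how to handle.
const handledExtensions = new Set(['keyUsage', 'basicConstraints', 'extKeyUsage']);

// Return the critical extensions that are not in our allowlist;
// a non-empty result would be the cue to abort the request.
function findUnhandledCritical(extensions) {
  return extensions.filter((ext) => ext.critical && !handledExtensions.has(ext.name));
}
```

If findUnhandledCritical(extensions) returns a non-empty array, abort the request with req.abort() as described above.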
Related
I'm currently attempting to set up an XMPP client using Stanza.js:
https://github.com/legastero/stanza
I have a working server that accepts connections from a Gajim client; however, when attempting to connect using the Stanza.js client.connect method, the server opens a websocket connection, but no events for authentication or session start ever fire.
The server logs do not show any plaintext password authentication attempts.
How can I actually see any of the stanza logs to debug this issue?
import * as XMPP from 'stanza';

const config = {
  credentials: { jid: '[jid]', password: '[password]' },
  transports: { websocket: '[socketurl]', bosh: false }
};

const client = XMPP.createClient(config);

client.on('raw:*', (data) => {
  console.log('data', data);
});

client.connect();
The onconnect event does fire, but it is the only event that fires.
Is there a way to manually trigger authentication that isn't expressed in the documentation?
The raw event handler should be able to give you the logging you want - but in your code sample, you are invoking it incorrectly. Try the following.
client.on('raw:*', (direction, data) => {
console.log(direction, data)
})
For reference, the docs state that the callback for the raw data event handler is
(direction: incoming | outgoing, data: string) => void
So the data that you are looking for is in the second argument, but your callback only has one argument (just the direction string "incoming" or "outgoing", although you have named the argument "data").
Once you fix the logging, I expect you will see the stream immediately terminate with a stream error. Your config is incorrect: the jid and password should be top-level fields. Review the stanza sample code; in the options to createClient there is no credentials object. Try the following:
const config = { jid: '[jid]', password: '[password]', transports: {websocket: '[socketurl]', bosh: false} };
Since your username and password are hidden behind an incorrect credentials object, stanza.io does not see them, so you are effectively trying to connect with no username and password, and no authentication is even attempted.
This issue happened to be caused by a configuration problem: the Jabber server was using plain authentication. Adding an additional line to the client definition file helped:
client.sasl.disable('X-OAUTH2')
Also adding
client.on('*', console.log)
offered more complete server logs.
How can I actually see any of the stanza logs to debug this issue?
If the connection is not encrypted, you can sniff the XMPP traffic with tools like
sudo tcpflow -i lo -Cg port 5222
You can force ejabberd to not allow encryption, so your clients don't use that, and you can read the network traffic.
Alternatively, in ejabberd.yml you can set the following, but it will probably generate a lot of log messages:
loglevel: debug
I am attempting to get this tutorial (here: https://www.hellorust.com/demos/add/index.html) to work, and it seems that whatever I do, I cannot get the WebAssembly function described on MDN to work properly.
So, I followed the instructions in the link above and got an add.wasm file. As far as I can tell this should be fairly simple and should work. After a little digging I found that the newest WebAssembly API is instantiateStreaming - the documentation for which can be found here: (https://developer.mozilla.org/en-US/docs/WebAssembly/Using_the_JavaScript_API).
The MDN example says to do the following:
var importObject = {
  imports: { imported_func: arg => console.log(arg) }
};
then
WebAssembly.instantiateStreaming(fetch('simple.wasm'), importObject)
.then(obj => obj.instance.exports.exported_func());
According to MDN the importObject is to unwrap the nested argument. Weird, but OK.
To make this as simple as possible I put the add.wasm file and the JS file that imports it in the same directory, and then did the following (NOTE: I am using Vue.js, but for anyone familiar with SPA-like libraries this should be similar):
window.WebAssembly.instantiateStreaming(fetch('./add.wasm', {
  headers: {
    "Content-Type": "application/wasm",
  },
}), importObject)
  .then(obj => {
    console.log('inside return obj from WebAssembly initiateStreaming')
    obj => obj.instance.exports.exported_func()
  })
  .catch(error => {
    console.log('there was some error; ', error)
  });
The error I get back is:
there was some error; TypeError: "Response has unsupported MIME type"
I've tried not adding the header to the fetch request, using fetch(add.wasm), dropping the window., dropping the importObject entirely and simply logging obj to the console. Nothing appears to work.
It may be that I have to add the application/wasm MIME type to webpack somehow if it is not widely supported, but I'm not sure, and I haven't seen any examples online.
Does anyone know how to get this to work?
EDIT:
Someone suggested that since this was a fetch request it had to be making the request from a backend server. This made sense to me, so I did the following:
WebAssembly.instantiateStreaming(fetch('http://localhost:8000/files/add.wasm'), importObject)
  .then(obj => {
    console.log('inside return obj from WebAssembly initiateStreaming')
    obj => obj.instance.exports.exported_func()
  })
  .catch(error => {
    console.log('there was some error; ', error)
  });
Where http://localhost:8000/files/{someFile} is a backend route that serves my files (which I made sure to put add.wasm in of course). Unfortunately, I get the same error (i.e. unrecognized MIME type) and I'm not sure why.
If for whatever reason you can't change the server to properly return application/wasm for .wasm file requests, you can work around the issue by changing the way you instantiate the WebAssembly module. Instead of doing this:
WebAssembly.instantiateStreaming(fetch("./add.wasm")).then(obj => /* ... */)
Do this:
const response = await fetch("add.wasm");
const buffer = await response.arrayBuffer();
const obj = await WebAssembly.instantiate(buffer);
obj.instance.exports.exported_func();
Or the equivalent using then() if you cannot use async/await.
In practice, what my workaround does is avoid calling instantiateStreaming(), which must check the MIME type returned by the server before proceeding (according to this specification). Instead, I call instantiate() with an ArrayBuffer and avoid the check altogether.
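The then()-based equivalent mentioned above can be sketched as a small helper (the function name is mine; any fetchable URL that returns the wasm bytes works, since no MIME check is performed on this path):

```javascript
// Sketch: instantiate a wasm module without instantiateStreaming(),
// so the server's MIME type is never checked.
function instantiateWithoutStreaming(url, importObject) {
  return fetch(url)
    .then((response) => response.arrayBuffer())
    .then((buffer) => WebAssembly.instantiate(buffer, importObject))
    .then((obj) => obj.instance);
}
```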
there was some error; TypeError: "Response has unsupported MIME type"
The web server you are running does not understand/serve the application/wasm MIME type.
You can use a Rust-based HTTP server; it knows about the wasm MIME type.
Installation
Simply use curl
curl -SsL https://cdn.rawgit.com/thecoshman/http/master/install.sh | sh
and execute the downloaded script, or you can explore other ways to do the same at https://crates.io/crates/https.
Running
Please use the downloaded server to serve your web application (index.html).
e.g
cd ${YOUR_APPS_PATH}
http
A snippet of code for a workaround has been published on the WebAssembly Git here. Unfortunately, it is only a workaround, and it defeats the purpose of instantiateStreaming(), which is said here to be "a lot more efficient", since the workaround needs an ArrayBuffer that instantiateStreaming() helps avoid.
James Wilson's "wasm-fractal" project deals with the error, like this:
importScripts("wasm_fractal.js");
delete WebAssembly.instantiateStreaming;
wasmFractal("./wasm_fractal_bg.wasm").then((wasm) => {
// establish connection between wasm and javascript
});
I use the delete WebAssembly.instantiateStreaming; trick myself during development, since my editor's built-in server serves wasm with the incorrect MIME type.
I'm writing a web application in HTML/JavaScript that records audio and uploads it to a server. Now, I would also like to put it in the cache so it's available to service workers for an offline scenario. What's the best way to do this?
Program flow:
Record audio
Capture data in a Blob
Save data on server
Listen to recorded stuff
If you are online, of course, all works well.
I would like to have the file locally available for listening before it is remotely saved, and back it up on the server ASAP.
This is my routine:
function mettiincache(name, myBlob) {
var resp = new Response(myBlob)
var CACHE_NAME = 'window-cache-v1';
caches.open(CACHE_NAME).then(function (cache) {
cache.put(name, resp)
}).catch(function (error) {
ChromeSamples.setStatus(error);
});
}
When I look in Application > Cache Storage in Chrome DevTools, I find an entry with the correct path/name and Content-Type, but with a Content-Length of 0 bytes.
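One thing worth checking before the answers below: the Response handed to cache.put() can be constructed with explicit headers taken from the Blob itself. This is a sketch of that variant, not a confirmed fix for the 0-byte entry; the header choices are my own assumptions:

```javascript
// Sketch: build a Response from a Blob with explicit Content-Type and
// Content-Length headers, suitable for passing to cache.put().
function responseFromBlob(myBlob) {
  return new Response(myBlob, {
    headers: {
      'Content-Type': myBlob.type || 'application/octet-stream',
      'Content-Length': String(myBlob.size)
    }
  });
}
```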
Note that you might create/use a separate worker, 'audioWorker.js', distinct from the SW.js, to run the app's audio cache, because IMO it is easier to test, and the SW lifecycle is fairly involved and oriented toward its own app cache of 'real' URLs used by the app.
Also note an inconsistency in the protocols allowed by the normal service worker implementation that intercepts calls to fetch: the blob: protocol the browser uses for your audio blobs will be rejected as an invalid URL by the browser's SW implementation. You cannot simply feed your blob's URL into the normal SW lifecycle, because that URL starts with 'blob:'.
The URL from the audio blob is fine if you choose NOT to use a SW for the cache. However, you might want to suffix it with a MIME type...
var url = URL.createObjectURL(audioBlob); // protocol of this is 'blob:'
var wUrl = url + "?type=" + audioBlob.type;
console.log("SW CACH1 " + wUrl);
myCacheWorker.postMessage({ action: 'navigate', url: wUrl });
In the cacheWorker's onmessage handler, write to the cache:
onmessage = function (e) {
  switch (e.data.action) {
    case 'navigate':
      upcache(e.data.url).then(() => {
        postMessage({ done: 'done' });
      });
      break;
  }
};

// boilerplate cache write below, from any example
var upcache = function (url) {
  return caches.open($cname)
    .then((openCache) => {
      return fetch(url).then(function (resp) {
        if (!resp.ok) {
          throw new TypeError('Bad response status');
        }
        return openCache.put(url, resp);
      });
    });
};
You can use SQLite to store data in the browser; there are a bunch of tools that might help you do that during development.
I am using this tool myself, https://sqlitebrowser.org/, for debugging, testing, and reading data from browsers.
You can use data series and publish them to the client side as well.
You may also refer to this link on how to use SQLite in the browser. Are you storing the audio files as binary files?
Generally, SQLite is good, but you have to take care not to store sensitive data without encrypting it, otherwise it will be compromised. You may also use the Indexed Database API v2.0.
Here is a link with more information about it:
https://www.w3.org/TR/IndexedDB/
I've followed this tutorial's code (https://dialogflow.com/docs/getting-started/basic-fulfillment-conversation) to return results of an API to dialog flow. However my webhook keeps failing. Can someone help me figure out why?
Here's one of the failed conversations:
Here's my code:
'use strict';
const http = require('http');
exports.Hadoop = (req, res) => {
  // Get name node server from the request
  let nameNodeServer = req.body.queryResult.parameters['nameNodeServer']; // nameNodeServer is a required param

  // Call the Hadoop API
  getNameNodeInfo(nameNodeServer).then(function (output) {
    res.json({ 'fulfillmentText': output }); // Return the results to Dialogflow
  }).catch(() => {
    res.json({ 'fulfillmentText': 'getNameNodeInfo() Error' });
  });
};
function getNameNodeInfo(nameNodeServer) {
  return new Promise((resolve, reject) => {
    // Create url for the HTTP request to get the name node info
    let url = 'http://' + nameNodeServer + '[rest of url]';

    // Make the HTTP request to get the name node info
    http.get(url, (res) => {
      let body = ''; // var to store the response chunks
      res.on('data', (chunk) => { body += chunk; });
      res.on('end', () => {
        // After all the data has been received, parse the JSON for desired data
        let response = JSON.parse(body);
        let beans = response['beans'][0];

        // Create response
        let output = `Percent Used: ${beans['PercentUsed']}`;

        // Resolve the promise with the output text
        console.log(output);
        resolve(output);
      });
      res.on('error', (error) => {
        console.log(`Error calling the Hadoop API: ${error}`);
        reject();
      });
    });
  });
}
I believe the getNameNodeInfo function and the retrieval of the name node server are correct, as they logged the correct output in debugging.
Diagnostic Info:
I contacted someone at Dialogflow and this was their response.
Thank you for providing all the information. I have observed in your
code that you have used http requests instead of https. The service
must use HTTPS and the URL must be publicly accessible in order for
the fulfillment to function. Dialogflow does not support self-signed
SSL certs. For information on SSL setup, please refer to this :
https://developers.google.com/web/fundamentals/security/encrypt-in-transit/enable-https
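Following that diagnosis, the request helper in the question can be rewritten to work over the built-in https module. A sketch (the helper takes the transport module as a parameter so the same shape works for http during local testing and https in fulfillment; the URL is assumed to return JSON):

```javascript
// Sketch: promisified GET that parses the response body as JSON.
// `lib` is either require('http') or require('https').
function getJson(lib, url) {
  return new Promise((resolve, reject) => {
    lib.get(url, (res) => {
      let body = '';
      res.on('data', (chunk) => { body += chunk; });
      res.on('end', () => {
        try { resolve(JSON.parse(body)); } catch (e) { reject(e); }
      });
    }).on('error', reject);
  });
}
```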
We've had a somewhat different, but related, issue: an Internal Server Error when running an agent.
"status": {
  "code": 500,
  "errorType": "internal_server_error",
  "errorDetails": "Internal Server Error"
},
This error was not caused by any changes we introduced. We are using that agent in a dev version of an app and one morning it stopped working.
We tested by creating a .zip and restoring it into a new agent. The new agent would work properly, but we would continue to get the 500 error on the agent hooked into our dev app. We submitted a help request, and overnight the error got resolved. We suspect that the Dialogflow team had to manually reboot the server or something similar.
How do I handle an ETIMEDOUT error on this call?
var remotePath = "myremoteurltocopy";
var localStream = fs.createWriteStream("myfil");
var out = request({ uri: remotePath });

out.on('response', function (resp) {
  if (resp.statusCode === 200) {
    out.pipe(localStream);
    localStream.on('close', function () {
      copyconcurenceacces--;
      console.log('aftercopy');
      callback(null, localFile);
    });
  }
  else
    callback(new Error("No file found at given url."), null);
});
Is there a way to wait longer? Or to request the remote file again?
What exactly can cause this error? Only a timeout?
This is caused when your request's response is not received within the given time (set by the request module's timeout option).
Basically to catch that error first, you need to register a handler on error, so the unhandled error won't be thrown anymore: out.on('error', function (err) { /* handle errors here */ }). Some more explanation here.
In the handler you can check whether the error is ETIMEDOUT and apply your own logic: if (err.code === 'ETIMEDOUT') { /* apply logic */ }.
If you want to request for the file again, I suggest using node-retry or node-backoff modules. It makes things much simpler.
If you want to wait longer, you can set timeout option of request yourself. You can set it to 0 for no timeout.
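The retry idea can also be sketched without a dependency (node-retry and node-backoff do this far more robustly; the attempt count and delay values below are arbitrary):

```javascript
// Sketch: retry a promise-returning function on ETIMEDOUT,
// doubling the delay between attempts.
function retry(fn, attempts = 3, delayMs = 100) {
  return fn().catch((err) => {
    if (attempts <= 1 || err.code !== 'ETIMEDOUT') throw err;
    return new Promise((resolve) =>
      setTimeout(() => resolve(retry(fn, attempts - 1, delayMs * 2)), delayMs)
    );
  });
}
```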
We can look at the error object for a code property that identifies the underlying system error and, in cases of ETIMEDOUT where a network call fails, act accordingly.
if (err.code === 'ETIMEDOUT') {
console.log('My dish error: ', util.inspect(err, { showHidden: true, depth: 2 }));
}
If you are using Node.js, this could be a possible solution:
const express = require("express");
const app = express();
const server = app.listen(8080);
server.keepAliveTimeout = 61 * 1000;
https://medium.com/hk01-tech/running-eks-in-production-for-2-years-the-kubernetes-journey-at-hk01-68130e603d76
Try switching internet networks and test your code again. I got this error, and the only solution was switching to another internet connection.
Edit: I now know other people besides me who have had this error, and the solution was contacting the ISP and asking them to check the DNS configuration, because the HTTP requests were failing. So switching networks definitely could help with this.
That is why I will not delete the post: it could save people a few days of headaches (especially noobs like me).
Simply use a different network. Using a different network solved this issue for me within seconds.