How can we ad-hoc decode/decompress the output produced by the nock recorder so that we can see the response as text? We are not sure whether the response is gzipped and/or otherwise encoded.
The object works fine when we load it into nock, and our tests behave as we expect, but to see what the API actually produced we are having to put logging statements in the implementation file.
We are recording the responses and saving them as JSON:
nock.recorder.rec({output_objects: true, dont_print: true});
JSON.stringify(nock.recorder.play())
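Roughly, our recording setup looks like this (the cassette filename is just an example of ours):

const fs = require('fs');
const nock = require('nock');

// Record real HTTP traffic as definition objects, without printing to stdout.
nock.recorder.rec({ output_objects: true, dont_print: true });

// ... run the code under test that performs the HTTP calls ...

// Persist the recorded definitions as JSON.
fs.writeFileSync('cassettes.json', JSON.stringify(nock.recorder.play(), null, 2));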
And our file looks like:
[
  {
    "scope": "https://some.api.com:443",
    "method": "POST",
    "path": "/auth?key=some_key",
    "body": {
      "logonId": "user#api.com",
      "logonPassword": "secret"
    },
    "status": 400,
    "response": [
      "1f8b0800000000000000458cbd6ac34010067b3fc5c735691263bb741344ec42f827420a492916692d1d9cb461f71c218cdf3d97266e6786b92d00c7aaa205290d1c59cd6d71bb3fff8b376939a1cd6abd7ac003cf89b97a5f96757efecc8ef9aede9fb2fc586455f5f55eeedca33db119757f0f5704266334a2ca4d44ec19170941263f76f06657b62dd6cb2af919ec9357cc7255f0cb403e4014df643689b6687d3b3e450c149b1e534f1113a3a71f868cb8f8c04b7ca48b8fa08efcf8ea16f75fa1776d91ee000000"
    ],
    "headers": {
      "cache-control": "no-store, no-cache, must-revalidate",
      "content-encoding": "gzip",
      "content-type": "application/json",
      "transfer-encoding": "chunked",
      "connection": "Close"
    }
  }
]
Nock serializes compressed (gzipped) responses as a hex buffer; luckily, xxd can turn the hex buffer back into binary data, which can then be gunzipped to recover the plain JSON text.
In summary:
echo <YOUR-HEX-BUFFER-HERE> | xxd -r -p | gunzip
With reference to the example in the question:
$ echo 1f8b0800000000000000458cbd6ac34010067b3fc5c735691263bb741344ec42f827420a492916692d1d9cb461f71c218cdf3d97266e6786b92d00c7aaa205290d1c59cd6d71bb3fff8b376939a1cd6abd7ac003cf89b97a5f96757efecc8ef9aede9fb2fc586455f5f55eeedca33db119757f0f5704266334a2ca4d44ec19170941263f76f06657b62dd6cb2af919ec9357cc7255f0cb403e4014df643689b6687d3b3e450c149b1e534f1113a3a71f868cb8f8c04b7ca48b8fa08efcf8ea16f75fa1776d91ee000000 \
> | xxd -r -p \
> | gunzip
{
  "errorParameters": {},
  "errorCode": 2010,
  "errorKey": "_ERR_INVALID_EMAILPASSWORD",
  "errorMessage": "Please correct the following issues: 1.Sorry either your e-mail or password didn't match what we have on file. Try it again?"
}
Also, at the time of writing there are active discussions and proposals on the nock project, so this may change in future releases; see:
https://github.com/nock/nock/issues/1212
https://github.com/nock/nock/pull/1372
The response from the HTTP request is coming back as gzipped data, as indicated by the content-encoding header. Nock is saving this data as a hex-encoded buffer string.
You can convert these cassettes into plain JSON with the following utility:
var zlib = require('zlib');
var fs = require('fs');
var path = require('path');

var argv = process.argv.slice(2);
var filename = path.resolve(argv[0]);
var file = fs.readFileSync(filename, { encoding: 'utf8' });
var cassettes = JSON.parse(file);

cassettes.forEach(function (cassette) {
  // Leave non-gzipped responses alone.
  if (cassette.headers['content-encoding'] !== 'gzip') {
    return;
  }
  // Hex string -> binary buffer -> decompressed JSON text.
  var response = Buffer.from(cassette.response[0], 'hex');
  var contents = zlib.gunzipSync(response).toString('utf8');
  cassette.response = JSON.parse(contents);
  // The stored response is no longer compressed.
  delete cassette.headers['content-encoding'];
});

fs.writeFileSync(filename, JSON.stringify(cassettes, null, 2), { encoding: 'utf8' });
Note: this will overwrite the original cassette file with one in which all gzipped responses have been converted to JSON. Also note that I'm not checking the content type, so you'll need to adapt this if you have responses that aren't JSON, as sketched below.
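For instance, a small adaptation (a sketch, assuming the lowercase header names nock records) could guard the parse inside the forEach above:

// Inside the forEach above: only parse as JSON when the content type says so.
var contentType = cassette.headers['content-type'] || '';
if (contentType.indexOf('application/json') !== -1) {
  cassette.response = JSON.parse(contents);
} else {
  cassette.response = contents; // keep other payloads as plain text
}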
Use jq with xxd to extract and decode the response field (the recorded file is an array of cassettes, and response is an array of hex chunks, hence the [] iterators):
jq -r '.[].response[]' file.json | xxd -r -p | gunzip
A little late to the party, but using Franco Rondini's and ChiperSoft's answer I came up with this:
const zlib = require('zlib');

// Goes from a hex representation of gzipped binary data to an object
module.exports.decode = input => {
  if (typeof input.join === 'function') {
    input = input.join('');
  }
  const tempBuffer = Buffer.from(input, 'hex');
  const unzippedBuffer = zlib.gunzipSync(tempBuffer);
  const contents = unzippedBuffer.toString('utf8');
  return JSON.parse(contents);
};

// Goes from an object to a zipped buffer encoded in hex
module.exports.encode = input => {
  const inputAsString = JSON.stringify(input);
  const tempBuffer = Buffer.from(inputAsString);
  const zippedBuffer = zlib.gzipSync(tempBuffer);
  return zippedBuffer.toString('hex');
};
This is probably not perfect, but it was helpful to be able to replace replies on the fly with objects.
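For example, a usage sketch against the recording format from the question (the module path and cassette file here are hypothetical):

const codec = require('./gzip-hex-codec'); // the encode/decode module above (hypothetical path)
const cassettes = require('./cassettes.json');

// Decode the recorded gzipped response into a plain object...
const body = codec.decode(cassettes[0].response);
console.log(body.errorKey); // '_ERR_INVALID_EMAILPASSWORD' in the question's example

// ...tweak it, then re-encode it so nock can replay it as gzip again.
body.errorMessage = 'Stubbed message';
cassettes[0].response = [codec.encode(body)];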
Related
I have to convert JS code to Python.
The JS code performs a file upload via a POST request, using fetch().
This is the JS code:
<input type="file" />
<button onclick="upload()">Upload data</button>
<script>
  upload = async () => {
    const fileField = document.querySelector('input[type="file"]');
    await uploadDoc(fileField.files[0]);
  };

  uploadDoc = async (file) => {
    let fd = new FormData();
    fd.append('file', file);
    fd.append('descr', 'demo_upload');
    fd.append('title', name);
    fd.append('contentType', 'text');
    fd.append('editor', user);
    let resp = await fetch(url, { method: 'POST', mode: 'cors', body: fd });
  };
</script>
The code works and complies with the fetch() docs provided here:
https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API/Using_Fetch#uploading_a_file
Now when I try to recreate this in Python, I get a 500 HTTP status code.
This is the python code:
from urllib import request
from urllib.parse import urlencode
import json

with open('README.md', 'rb') as f:
    upload_credentials = {
        "file": f,
        "descr": "testing",
        "title": "READMEE.md",
        "contentType": "text",
        "editor": username,
    }

    url_for_upload = ""  # here you place the upload URL
    req = request.Request(url_for_upload, method="POST")
    form_data = urlencode(upload_credentials)
    form_data = form_data.encode()
    response = request.urlopen(req, data=form_data)

    http_status_code = response.getcode()
    content = response.read()
    print(http_status_code)
    print(content)
This doesn't work, however, and I get this error:
raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 500:
Someone with both JS and Python experience might be able to see what is wrong on the Python side, or how to convert the fetch() call to Python.
I think the problem is that you're reading the file in binary mode (rb). You need just r:
with open('README.md', 'r') as f:  # r instead of rb
However, I still recommend the requests module, which is more widely used and is easier in this case.
import requests  # pip install requests

url = "https://www.example.com/upload_file"

# Note: don't set a global content-type header here; it would override the
# multipart boundary that requests generates. Set the per-file content type
# in the files tuple instead.
files = {
    # tuple of (file name, file object, content type); "text/plain" forces
    # the text file content type for this part
    "my_file": ("FILE_NAME.md", open("README.md", "rb"), "text/plain")
}
data = {  # in case you want to send a request payload too
    "foo": "bar"
}

r = requests.post(url, files=files, data=data)

if r.status_code == 200:
    print(r.text)  # print response as a string (r.content for bytes, if response is binary like an image)
In Node.js, I'm getting this response from an API:
{
  "file": "PHN0eWxlPnRlRrU3VRbUNDJyAvPjwvcD4K",
  "mime_type": "text/html",
  "document_type": "shippingLabel"
}
To reconstruct the file, the data from the node needs to be base64-decoded and interpreted according to the mime_type.
How can I get the file (e.g. a .pdf) and save it to a directory?
Using fs.writeFileSync(file, data[, options]):
const fs = require('fs');

// get your response somehow...
const response = {
  file: 'PHN0eWxlPnRlRrU3VRbUNDJyAvPjwvcD4K',
  mime_type: 'text/html',
  document_type: 'shippingLabel'
};

// LUT for MIME type to extension
const ext = {
  'text/html': 'html',
  // ...
};

// save to shippingLabel.html
fs.writeFileSync(`${response.document_type}.${ext[response.mime_type]}`, response.file, 'base64');
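If you prefer to make the base64 decoding explicit (same response and ext objects as above), decoding to a Buffer first is equivalent:

// Decode the base64 payload into raw bytes, then write the bytes out.
const bytes = Buffer.from(response.file, 'base64');
fs.writeFileSync(`${response.document_type}.${ext[response.mime_type]}`, bytes);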
I have a Jenkins job that takes a file as an input. The job reads the input file and then processes it.
I am trying to run the Jenkins job from a Node.js script, and I am able to trigger the job using the jenkins npm package.
My problem is that I am not able to run the job by passing the file it requires.
In Jenkins, I have to find the particular job, navigate to it, click "Build with parameters", and then select the file and build.
I am trying to simplify this with a React UI that takes a file as input and submits it to the Node.js script. That script then has to upload the file to Jenkins and build the job.
The Node.js code is:
var file0 = '/Users/m0a00pf/Documents/react-js/asda/src/files/APA.csv';

exports.buildJenkinsJob = function buildJenkinsJob() {
  jenkins.job.build({
    "name": "Create a job",
    "parameters": {
      "name": "\\src\\main\\resources\\com\\asda\\qa\\data\\APA\\APA.csv",
      "file": file0
    }
  }, function (err, data) {
    if (err)
      throw err;
    else
      console.log(data);
  });
}
The parameters part is not working. When I run the job without parameters,
jenkins.job.build('Create a job');
this works fine.
I changed the options to:
var options = {
  method: 'POST',
  url: 'http://localhost/job/JobName/buildWithParameters?delay=0sec&Jenkins-Crumb=asdf345672das',
  auth: {
    username: jenkins.username,
    password: jenkins.password
  },
  headers: {
    'content-type': 'multipart/form-data; boundary=----WebKitFormBoundary7MA4YWxkTrZu0gW'
  },
  body: {
    'fileParameterName': {
      value: fs.createReadStream(absoluteFilePath),
      options: { filename: FileName, contentType: null }
    },
    'param': 'value'
  }
};
There is a plain JS way of doing this, with just the popular Axios and form-data libraries:
var axios = require('axios');
var FormData = require('form-data');
var fs = require('fs');

var data = new FormData();

// Notice the URL ends in /build, not /buildWithParameters, when the job has file parameters.
const jenkinsUrl = 'http://example_jenkins_server_url.com/job/<JOB_NAME>/build';
const userName = 'example_user';
const password = 'example_pass';

const params = {"parameter": [
  // 'file0' here is the field name we append to the form-data object below;
  // this informs Jenkins which file maps to which job parameter, so even
  // multiple file uploads can be done using this approach!
  {"name": "<Name Of file parameter in Job>", "file": "file0"},
  // in case you have additional string parameters, add them here
  {"name": "StringParam1", "value": "value"}
]};

data.append('file0', fs.createReadStream(<full file path>));
data.append('json', JSON.stringify(params));

var config = {
  method: 'post',
  url: jenkinsUrl,
  headers: {
    Authorization: `Basic ${Buffer.from(`${userName}:${password}`).toString('base64')}`,
    ...data.getHeaders()
  },
  data: data
};

axios(config)
  .then(function (response) {
    console.log(JSON.stringify(response.data));
  });
A web page (the front end) is calling a service which sends a PDF stream as a response.
Here is the front-end code:
'click .btn': function (event) {
  /.../
  event.preventDefault();
  Http.call(params, (err, res) => { // callback
    if (err) console.log(err); // nothing
    console.log({ res }); // print below result
    const blob = new Blob(
      [res.content],
      { type: `${res.headers['content-type']};base64` }
    );
    saveAs(blob, res.headers['content-disposition'].slice(21));
  });
}
Here is the response from the server (console.log(res)), printed in the console as { res: Object }:
content: "%PDF-1.4↵1 0 obj↵<<↵/Title (��)↵/Creator (��)↵/Prod ..... lot of characters....%"
data: null,
statusCode: 200,
headers: {
  connection: "close",
  content-disposition: "attachment; filename=myDoc.pdf",
  content-type: "application/pdf",
  date: "date",
  transfer-encoding: "chunked",
  x-powered-by: "Express"
}
However, the PDF is downloaded with no content; it's completely blank, as if corrupted (but I can see the content in the string). It works well with the CSV routes (I send a CSV as a stream and download it with the same method, and I get the data).
I think there is something going on with the %PDF ...% format, but I didn't manage to find anything.
Note: with Postman it works; my PDF is saved, the page is not blank, and I get the data. So there is something on the front end that I am not doing right.
I also tried:
const fileURL = URL.createObjectURL(blob);
window.open(fileURL); // instead of saveAs
but the result is the same blank page (just in another tab instead of a saved PDF).
Any ideas?
You probably forgot to specify the response type in your initial back-end call; from the example you posted, "arraybuffer" would be the correct one here (the documentation lists all the response types).
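For instance, if the front end used plain fetch, a minimal sketch might look like this (the URL is a placeholder, and Http.call would need whatever equivalent option it exposes):

async function downloadPdf() {
  // Request the body as binary (ArrayBuffer) so the PDF bytes are not
  // mangled by text decoding.
  const res = await fetch('/api/my-pdf-route'); // placeholder URL
  const buf = await res.arrayBuffer();
  const blob = new Blob([buf], { type: res.headers.get('content-type') });
  saveAs(blob, 'myDoc.pdf'); // FileSaver's saveAs, as in the question
}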
I'm trying to load a .json file into a variable in JavaScript, but I can't get it to work. It's probably just a minor error, but I can't find it.
Everything works just fine when I use static data like this:
var json = {
id: "whatever",
name: "start",
children: [{
"id": "0.9685",
"name": " contents:queue"
}, {
"id": "0.79281",
"name": " contents:mqq_error"
}
}]
}
I put everything that's inside the {} into a content.json file and tried to load that into a local JavaScript variable as explained here: load json into variable.
var json = (function () {
  var json = null;
  $.ajax({
    'async': false,
    'global': false,
    'url': "/content.json",
    'dataType': "json",
    'success': function (data) {
      json = data;
    }
  });
  return json;
})();
I ran it with the Chrome debugger, and it always tells me that the value of the variable json is null. The content.json file resides in the same directory as the .js file that calls it.
What did I miss?
My solution, as answered here, is to use:
var json = require('./data.json'); // with path
The file is loaded only once; further requests use the cache.
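If you ever need to bypass that cache (say, the file changes on disk at runtime), one option is to delete the cached entry before requiring again:

// Drop the cached copy so the next require() re-reads the file from disk.
delete require.cache[require.resolve('./data.json')];
var json = require('./data.json');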
Edit: to avoid caching, here's the helper function from the blog post given in the comments, using the fs module:
var fs = require('fs');

var readJson = (path, cb) => {
  fs.readFile(require.resolve(path), (err, data) => {
    if (err)
      cb(err)
    else
      cb(null, JSON.parse(data))
  })
}
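Usage would look something like this (the path is resolved relative to the calling module):

readJson('./content.json', (err, data) => {
  if (err) throw err;
  console.log(data); // fresh file contents on every call, no require cache
});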
For ES6/ES2015 you can import it directly, like:
// example.json
{
  "name": "testing"
}

// ES6/ES2015
// app.js
import * as data from './example.json';
const { name } = data;
console.log(name); // output 'testing'
If you use TypeScript, you may declare a JSON module like:
// typings.d.ts
declare module "*.json" {
  const value: any;
  export default value;
}
Since TypeScript 2.9+ you can add the resolveJsonModule compiler option in tsconfig.json:
{
  "compilerOptions": {
    "target": "es5",
    ...
    "resolveJsonModule": true,
    ...
  },
  ...
}
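With that flag set, the import itself is plain; a minimal sketch against the example.json above (the default-import form also assumes "esModuleInterop": true):

// app.ts
import data from './example.json';
console.log(data.name); // 'testing', with the type inferred from the file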
If you pasted your object into content.json directly, it is invalid JSON. JSON keys and values must be wrapped in double quotes (" not ') unless the value is numeric, boolean, null, or composite (array or object). JSON cannot contain functions or undefined values. Below is your object as valid JSON.
{
  "id": "whatever",
  "name": "start",
  "children": [
    {
      "id": "0.9685",
      "name": " contents:queue"
    },
    {
      "id": "0.79281",
      "name": " contents:mqq_error"
    }
  ]
}
You also had an extra }.
A solution without require or fs:
var json = [];
fetch('./content.json')
  .then(response => response.json()) // response.json() itself returns a promise
  .then(data => json = data);
Note that json is only populated once both promises have resolved, so use it inside the final .then.
The built-in Node.js module fs will do it either asynchronously or synchronously, depending on your needs.
You can load it using var fs = require('fs');
Asynchronous:
fs.readFile('./content.json', (err, data) => {
  if (err)
    console.log(err);
  else {
    var json = JSON.parse(data);
    // your code using the json object
  }
})
Synchronous:
var json = JSON.parse(fs.readFileSync('./content.json').toString());
There are two possible problems:
1. AJAX is asynchronous, so json will be undefined when you return from the outer function. When the file has been loaded, the callback will set json to some value, but at that time nobody cares anymore.
I see that you tried to fix this with 'async': false. To check whether this works, add this line to the code and check your browser's console:
console.log(['json', json]);
2. The path might be wrong. Use the same path that you used to load your script in the HTML document: if your script is js/script.js, use js/content.json.
Some browsers can show you which URLs they tried to access and how that went (success/error codes, HTML headers, etc.). Check your browser's development tools to see what happens.
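To make the first point concrete, here is a rough sketch of the callback style that avoids 'async': false entirely (the function name and URL are just illustrative):

function loadJson(url, done) {
  $.ajax({
    url: url,
    dataType: 'json',
    success: function (data) { done(null, data); },
    error: function (xhr, textStatus, errorThrown) { done(errorThrown || textStatus); }
  });
}

// Use the data inside the callback, once it has actually arrived.
loadJson('/content.json', function (err, json) {
  if (err) return console.error(err);
  console.log(json);
});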
For JSON in the given format in the file ~/my-app/src/db/abc.json:
[
  {
    "name": "Ankit",
    "id": 1
  },
  {
    "name": "Aditi",
    "id": 2
  },
  {
    "name": "Avani",
    "id": 3
  }
]
In order to import it into a .js file like ~/my-app/src/app.js:
import React from "react";
const json = require("./db/abc.json");

class Arena extends React.Component {
  render() {
    return json.map((user) => {
      return <div>{user.name}</div>;
    });
  }
}

export default Arena;
Output:
Ankit Aditi Avani
For free JSON files to work with, go to https://jsonplaceholder.typicode.com/.
And to import your JSON files, try this:
const dataframe1=require('./users.json');
console.log(dataframe1);
An answer from the future:
In 2022, we have the import assertions API for importing a JSON file in a JS file.
import myjson from "./myjson.json" assert { type: "json" };
console.log(myjson);
Browser support: as of September 2022, only Chromium-based browsers and Safari support it.
Read more at: the v8 import assertions post
To export a specific value from the output.json file (containing the JSON shared in the question) to a shell variable, say VAR (note that children is an array, so it needs an index):
export VAR=$(jq -r '.children[0].id' output.json)
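Against the JSON shown in the question, for example:
$ echo "$VAR"
0.9685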