I need to GET data from an API that uses square brackets as part of the parameter name. I didn't write the API, so don't shoot the messenger!
Edit: I should have noted, this code will run on node (server-side), not in the browser.
I'm using Axios in JavaScript, and this is my axios call:
axios.get(url, {params: {queryParams}})
.then(res => {
brands = res.data;
console.log(res.data);
})
.catch(error => {
console.log( '\n\n\n\n')
console.log(error);
});
The params are as follows. For brevity, I'm showing the three different formats I've tried (direct character, escaped and ASCII encoded), but in each attempt, I've passed the three parameters with the same format.
// Set the query parameters
let queryParams = {
"tables": table,
"manifest": manifest,
"where[0][0]": field,
"where%5B0%5D%5B1%5D": "%3D",
"where\\[0\\]\\[2\\]": searchValue,
"ordery_by": "id%2C%20ASC",
"limit": "100",
"app": "json",
'client_key': authkey
}
In all cases, axios seems to transform the parameters into a JSON Web Token.
If, on the other hand, I concatenate the parameters to the URL as a string, the request works, and I get the data I expected.
let fullPath = url.concat(
"?tables=", table,
"&manifest=", manifest,
"&where%5B0%5D%5B0%5D=", field,
"&where%5B0%5D%5B1%5D=", "%3D",
"&where%5B0%5D%5B2%5D=", searchValue,
"&ordery_by=", "id%2C%20ASC",
"&limit=", "100",
"&app=", "json",
"&client_key=", authkey
)
While I have a workaround (as shown above), is there a way to do this with a proper parameters object?
If you are doing this in the browser, you could use URLSearchParams() to iterate over a human-readable object and have it create the query string.
There is also a similar module available for Node.
axios also supports passing a URLSearchParams object as the params argument.
let queryParams = {
"tables": 1,
"manifest": 2,
"where[0][0]": 3,
"where[0][1]": "=",
"where[0][2]": 4,
"ordery_by": "id,ASC",
"limit": "100",
"app": "json",
'client_key': 'abc'
}
const sParams = new URLSearchParams(Object.entries(queryParams));
console.log('query string')
console.log(sParams.toString())
console.log('sParam entries')
console.log(JSON.stringify([...sParams]))
Going a step further, you can construct the full URL with the URL constructor:
const url = new URL('https://myApi.com')
url.search = new URLSearchParams(Object.entries(queryParams));
console.log(url.href)
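To tie this back to the original axios call: since axios accepts a URLSearchParams instance directly as the params option, the bracketed keys come out percent-encoded with no manual escaping. A minimal sketch (the table and field values are placeholders, and the actual request line is commented out so the snippet stays self-contained):

```javascript
// Bracketed keys such as "where[0][0]" are percent-encoded by
// URLSearchParams itself, so no manual escaping is needed.
const sParams = new URLSearchParams({
  'tables': 'products',   // placeholder values throughout
  'where[0][0]': 'brand',
  'where[0][1]': '=',
  'where[0][2]': 'acme',
});

// axios.get(url, { params: sParams }); // the real request, omitted here
console.log(sParams.toString());
// tables=products&where%5B0%5D%5B0%5D=brand&where%5B0%5D%5B1%5D=%3D&where%5B0%5D%5B2%5D=acme
```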
Related
I have an object which is returned like this:
[
{
name:"getSpeed",
params:["distance","time"]
},
{
name:"getTime",
params:["speed","distance"]
},
...
]
This object is subject to change as it is gathered from an embedded device.
I'm trying to convert this into an object with callable functions, i.e.
let myObj = {
getSpeed: function(distance, time){
/* do something (this is irrelevant) */
},
getTime: function(speed, distance){
/* do something (again not relevant) */
}
}
Is there any way to map an array of strings to function parameters when mapping over an array?
According to your comments, it appears you want to create functions from this definition which send commands as a function-call-like string that is evaled on the other side, and you don't actually care about the parameter names but rather about the correct number of parameters.
I would therefore recommend something like this:
const myObj = Object.fromEntries(data.map(({ name, params }) => [
name,
(...args) => {
if (args.length !== params.length) {
throw new TypeError(`${name} expected ${params.length} arguments, got ${args.length}`)
}
this.UART.write(`${name}(${args.map(arg => JSON.stringify(arg)).join(',')})`)
}
]))
This will work with all the datatypes that JSON supports, and as a side effect will also pass undefined as an argument correctly.
Here is a runnable example with console.log instead of this.UART.write:
const data = [
{
name: "getSpeed",
params: ["distance", "time"]
},
{
name: "getTime",
params: ["speed", "distance"]
}
]
const myObj = Object.fromEntries(data.map(({ name, params }) => [
name,
(...args) => {
if (args.length !== params.length) {
throw new TypeError(`${name} expected ${params.length} arguments, got ${args.length}`)
}
console.log(`${name}(${args.map(arg => JSON.stringify(arg)).join(',')})`)
}
]))
myObj.getSpeed(123, 456) // prints `getSpeed(123,456)`
myObj.getTime(123, 456) // prints `getTime(123,456)`
myObj.getTime("hello", true) // prints `getTime("hello",true)`
myObj.getTime(1) // throws `getTime expected 2 arguments, got 1`
However, as you said yourself, the whole eval business is not ideal anyway. I would recommend, if possible, reconsidering the protocol to use something more secure and robust like gRPC or, one layer below, protocol buffers. Given that you are using JavaScript on both ends, JSON-RPC could also be a nice solution.
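For illustration only, here is a hand-written sketch of what a JSON-RPC 2.0 envelope for the same getSpeed call could look like instead of an eval'd string (the id is arbitrary, and no library is involved):

```javascript
// Hypothetical JSON-RPC 2.0 request for the getSpeed example above;
// the receiving side would dispatch on `method` instead of eval'ing.
const request = {
  jsonrpc: '2.0',
  method: 'getSpeed',
  params: [123, 456],
  id: 1,
};
const wire = JSON.stringify(request);
console.log(wire);
// {"jsonrpc":"2.0","method":"getSpeed","params":[123,456],"id":1}
```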
Here is an API call, which needs some values to return a specific set of products,
The issue: "category_slug" has to be an array, but for some reason the API says it's not. What is the problem here?
const url = new URL(
"https://someApi/products"
);
let params = {
"limit": "10",
"page": "1",
"category_slug": ["shoes"]
};
// "limit" is how many products are on one page.
// "page" is the fetched page with the specific "limited" products.
//issue
// "category_slug" is a specific products category, this has to be an array but for
// some reason, the API says it's not. what is the problem here?
Object.keys(params)
.forEach(key => url.searchParams.append(key, params[key]));
//here I'm appending the specific value in {params} to the URL.
let headers = {
"Accept": "application/json",
"Content-Type": "application/json",
};
fetch(url, {
method: "GET",
headers: headers,
})
.then(response => response.json())
.then(json => console.log(json));
See, you expect too little and too much at the same time of such a beautiful thing as URLSearchParams.
Too little, because usually you can just pass the whole params object into its constructor without wasting time on keys, forEach, etc...
const url = new URL('https://example.com/');
const params = {
limit: 10,
page: 1
};
url.search = new URLSearchParams(params); // yes, that easy
console.log(url.toString());
// https://example.com/?limit=10&page=1
Too much, because URLSearchParams is not designed to work with arrays. When appended element is an array, it's just stringified:
const url = new URL('https://example.com/');
const params = {
slug: [1, 2, 3]
};
url.search = new URLSearchParams(params);
console.log(url); // https://example.com/?slug=1%2C2%2C3
In this case, the slug param got 1,2,3 (the result of [1, 2, 3].toString()) assigned to it, with all the commas urlencoded (replaced by the %2C sequence).
Your API might actually work with this, but there's a huge chance it expects array args to be passed in the following format:
https://example.com/?slug=1&slug=2&slug=3
... yet even this might not work, if API expects array args to be passed with [] appended to each key, like this:
https://example.com/?slug[]=1&slug[]=2&slug[]=3
So you'll have to check your API (it's hard to debug such things just by looking into a crystal ball, you know...), take its flavor into account, and process your items separately. For example:
const url = new URL('https://example.com/');
const params = {
limit: 10,
page: 1
};
url.search = new URLSearchParams(params);
const slugs = [1, 2, 3];
url.search += ['', ...slugs.map(
slug => `category_slug[]=${encodeURIComponent(slug)}`)].join('&');
console.log(url.toString());
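If the API instead expects the other flavor (?slug=1&slug=2, without the []), URLSearchParams.append can build the repeated keys directly, no string concatenation needed. A sketch with made-up values:

```javascript
// Repeated-key array format: append the same key once per value.
const url = new URL('https://example.com/');
const sp = new URLSearchParams({ limit: 10, page: 1 });
for (const slug of ['shoes', 'boots']) {
  sp.append('category_slug', slug);
}
url.search = sp;
console.log(url.toString());
// https://example.com/?limit=10&page=1&category_slug=shoes&category_slug=boots
```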
I'm having some trouble passing a variable that holds a JSON object into SendGrid's dynamic_template_data. My setup looks like this:
const send = async (address, mentions) => {
console.log('mentions json obj', mentions)
let name = "john"
try {
let config = {
headers: {
Authorization: `Bearer ${process.env.sendgridKey}`,
}
}
let data = {
personalizations: [
{
to: [
{
email: `${address}`,
},
],
dynamic_template_data: {
name: name,
allMentions: mentions
}
}
],
from: {
email: "email@email.com",
name: "Mentionscrawler Team"
},
template_id: process.env.template_id,
}
await axios.post("https://api.sendgrid.com/v3/mail/send", data, config)
} catch (error) {
console.error(error, 'failing here>>>>>>>')
}
}
When I console.log mentions, which is JSON, and paste the output from the terminal directly into the allMentions key, it works. But when I just pass in mentions itself, nothing shows up in the sent email. I've been very confused for the last few hours about why this is happening. Any advice appreciated.
Edit: I should also note that allMentions is an object with keys that hold arrays, so I'm looking to iterate over those arrays. Again, this all works if I paste in directly what mentions is, but passing in mentions itself gives me an issue.
Thank you very much,
Just realized what was wrong. SendGrid's template requires a JSON object, so I assumed I needed to use JSON.stringify on my mentions object. Turns out I didn't need to, as long as all values are in string format.
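A hedged sketch of that fix: coercing every leaf value in the mentions object to a string before handing it to dynamic_template_data. The mentions shape below is invented for illustration; only the string-coercion idea comes from the answer above.

```javascript
// Invented mentions shape: site name -> array of mention values,
// some of which are numbers rather than strings.
const mentions = { twitter: ['@a', '@b'], reddit: [42, 7] };

// Coerce every leaf value to a string before building the payload.
const allMentions = Object.fromEntries(
  Object.entries(mentions).map(([site, items]) => [site, items.map(String)])
);
console.log(allMentions); // reddit values are now the strings '42' and '7'
```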
I have the below JSON response after running a Postman test of a REST API:
{
"glossary": {
"title": "example glossary",
"GlossDiv": {
"title": "S",
"GlossList": {
"GlossEntry": {
"ID": "SGML",
"SortAs": "SGML",
"GlossTerm": "Standard Generalized Markup Language",
"Acronym": "SGML",
"Abbrev": "ISO 8879:1986",
"GlossDef": {
"para": "A meta-markup language, used to create markup languages such as DocBook.",
"GlossSeeAlso": ["GML", "XML"]
},
"GlossSee": "markup"
}
}
}
}
}
Now I would like to compare the above JSON against a predefined JSON. Say it's the same as above.
How can I compare the two JSONs via a Postman test?
I had a similar problem to solve, except that my JSON also contained an array of objects. I used the following technique, which can be modified to deal with the simple array of strings in your question. I created a global object of helper functions called "assert", containing helpers such as "areEqual" and "areArraysOfObjectsEqual", and saved these under the "Tests" tab at a top folder level of my tests.
assert = {
areEqual: (actual, expected, objectName) => {
pm.test(`Actual ${objectName} '` + actual + `' matches Expected ${objectName} '` + expected + `'`, () => {
pm.expect(_.isEqual(actual, expected)).to.be.true;
});
},
areArraysOfObjectsEqual: (actual, expected, objectName) => {
if (!_.isEqual(actual, expected)) {
// Arrays are not equal so report what the differences are
for (var indexItem = 0; indexItem < expected.length; indexItem++) {
assert.compareArrayObject(actual[indexItem], expected[indexItem], objectName);
}
}
else
{
// This fake test will always pass and is just here for displaying output to highlight that the array has been verified as part of the test run
pm.test(`actual '${objectName}' array matches expected '${objectName}' array`);
}
},
compareArrayObject: (actualObject, expectedObject, objectName) => {
for (var key in expectedObject) {
if (expectedObject.hasOwnProperty(key)) {
assert.areEqual(expectedObject[key], actualObject[key], objectName + " - " + key);
}
}
}
};
Your "Pre-request Script" for a test would set your expected object
const expectedResponse =
{
"id": "3726b0d7-b449-4088-8dd0-74ece139f2bf",
"array": [
{
"item": "ABC",
"value": 1
},
{
"item": "XYZ",
"value": 2
}
]
};
pm.globals.set("expectedResponse", expectedResponse);
Your Test would test each item individually or at the array level like so:
const actualResponse = JSON.parse(responseBody);
const expectedResponse = pm.globals.get("expectedResponse");
assert.areEqual(
actualResponse.id,
expectedResponse.id,
"id");
assert.areArraysOfObjectsEqual(
actualResponse.myArray,
expectedResponse.myArray,
"myArrayName");
This technique will give nice "property name actual value matches expected value" output and works with arrays of objects being part of the JSON being compared.
Update:
To test your array of strings "GlossSeeAlso", simply call the supplied global helper method in any of your tests like so:
assert.compareArrayObject(
actualResponse.glossary.GlossDiv.GlossList.GlossEntry.GlossDef.GlossSeeAlso,
expectedResponse.glossary.GlossDiv.GlossList.GlossEntry.GlossDef.GlossSeeAlso,
"glossary.GlossDiv.GlossList.GlossEntry.GlossDef.GlossSeeAlso");
Primitive types in JSON key value pairs can be tested like so:
assert.areEqual(
actualResponse.glossary.title,
expectedResponse.glossary.title,
"glossary.title");
I got it after a while. Add a test to your request and use the Runner to run all the requests in your collection.
Postman info: Version 7.10.0 for Mac.
Test scripts:
pm.test("Your test name", function () {
var jsonData = pm.response.json();
pm.expect(jsonData).to.eql({
"key1": "value1",
"key2": 100
});
});
You can paste this code into your collection or single request tests tab.
What this code does is save the response into a global variable keyed by the request name. You can change your environment and hit the same request; if the responses are different, the test will fail.
const responseKey = [pm.info.requestName, 'response'].join('/');
let res = '';
try {
res = JSON.stringify(pm.response.json());
} catch(e) {
res = pm.response.text();
}
if (!pm.globals.has(responseKey)) {
pm.globals.set(responseKey, res);
} else {
pm.test(responseKey, function () {
const response = pm.globals.get(responseKey);
pm.globals.unset(responseKey);
try {
const data = pm.response.json();
pm.expect(JSON.stringify(data)).to.eql(response);
} catch(e) {
const data = pm.response.text();
pm.expect(data).to.eql(response);
}
});
}
Hope this helps.
You can write JavaScript code inside the Tests tab of Postman. Just write simple code to compare, and check the result in Tests.
var serverData = JSON.parse(responseBody);
var JSONtoCompare = {}; //set your predefined JSON here.
tests["Body is correct"] = JSON.stringify(serverData) === JSON.stringify(JSONtoCompare); // === on raw objects compares references, not contents
Looks like the same question asked at POSTMAN: Comparing object Environment variable with response's object which also lists a solution that works, which is to use JSON.stringify() to turn the objects into strings and then compare the strings.
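One caveat with the stringify approach (a sketch of the pitfall, not something from the linked answer): JSON.stringify is sensitive to key order, so two deeply equal objects can still compare unequal as strings:

```javascript
// Two deeply equal objects, serialized with different key orders.
const a = JSON.stringify({ x: 1, y: 2 });
const b = JSON.stringify({ y: 2, x: 1 });
console.log(a === b); // false: '{"x":1,"y":2}' vs '{"y":2,"x":1}'
```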
Came across this issue when migrating from a legacy API to a new one and wanting to assert that the new API is exactly the same as the old one under different scenarios.
For context, this clones the params of the original GET request to the legacy endpoint and validates that both responses match up.
LEGACY_API_HOST should be defined in the environment, and the request itself is going to the new API.
const { Url } = require('postman-collection');
// Setup the URL for the Legacy API
const legacyRequestUrl = new Url({ host: pm.variables.replaceIn("http://{{LEGACY_API_HOST}}/blah")});
// Add All Parameters From the Source Query
legacyRequestUrl.addQueryParams(pm.request.url.query.all());
// Log out the URL For Debugging Purposes
console.log("URL", legacyRequestUrl.toString());
pm.sendRequest(legacyRequestUrl.toString(), function (err, response) {
pm.test('New API Response Matches Legacy API Response', function () {
// Log Out Responses for Debugging Purposes
console.log("New API Response", pm.response.json())
console.log("Legacy API Response", response.json())
// Assert Both Responses are Equal
pm.expect(_.isEqual(pm.response.json(), response.json())).to.be.true
});
});
Link to an example collection
https://www.getpostman.com/collections/4ff9953237c0ab1bce99
Write JavaScript code under the 'Tests' section. Refer to the Postman documentation for more info.
I am making a POST request:
var offers = $resource('/api/offers/:id', {
    id: '@id'
});
offers.save({}, { name: $scope.newOfferName }, function (offerId) {
$location.path('/offers/' + offerId);
});
I expect offerId to be a string "key", but instead I get an array [0] "k", [1] "e", [2] "y".
On the backend I use Nancy and I return the response using:
Post["/"] = _ =>
{
return Response.AsText("key");
};
The response headers say Content-Type: text/plain, and in the Chrome preview (Network tab) I can see "key".
When I return an object as JSON it works fine, but I don't want to create a fake class (with a single string field) just to pass a string to the client.
I assume Nancy is fine here. What is happening with Angular?
You don't have to create a class for that. You can use an anonymous type:
Post["/"] = _ =>
{
return Response.AsJson(new { offerId = "key" });
};
Angular has a default transform that tries to parse incoming data as JSON.
https://github.com/angular/angular.js/blob/master/src/ng/http.js#L94
You could remove that transform from the default transformResponse array in $http completely, or replace it with one that checks the Content-Type before trying to transform the data.
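A sketch of such a replacement transform, written as a framework-agnostic function so it can be dropped into $http's transformResponse array (the function name is illustrative, not part of Angular's API):

```javascript
// Only JSON-parse the body when the Content-Type actually says JSON;
// leave text/plain bodies like "key" untouched.
function parseByContentType(data, contentType) {
  if (typeof data === 'string' && /application\/json/.test(contentType || '')) {
    try {
      return JSON.parse(data);
    } catch (e) {
      // Not valid JSON after all; fall through and return the raw string.
    }
  }
  return data;
}

console.log(parseByContentType('key', 'text/plain'));                    // 'key'
console.log(parseByContentType('{"offerId":"key"}', 'application/json')); // parsed object
```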