Say I have a small function that takes a Request object as an argument and calls the fetch() API. However, I always want to append something to the URL, such as ?foo=bar. I'm curious what the best way would be to go about that.
Example:
function fetchFoo(request) {
  request.url += '?foo=bar';
  return fetch(request);
}
The issue I have is that this won't work: the Fetch API specification states that the url property is read-only.
Is there a way to work around this? I'm thinking I might need to construct an all-new Request object, but I'm unsure what a clever way is to inherit all the options from the previous Request object.
Note that I am able to override any other property by using this syntax:
var originalRequest = new Request('/url');
var overriddenRequest = new Request(originalRequest, { method: 'POST' });
Although it wasn't clear from the docs, it seems that the second init parameter takes precedence over the values passed via the originalRequest parameter. I just can't seem to come up with a way to do this for the url as well.
You could leverage the keys that are on the Request.prototype to build a new Request object in just a few lines.
function newRequest(input, init) {
  var url = input;
  if (input instanceof Request) {
    url = mutateUrl(input.url);
    init = init || {};
    Object.keys(Request.prototype).forEach(function (value) {
      init[value] = input[value];
    });
    delete init.url;
    return input.blob().then(function (blob) {
      if (input.method.toUpperCase() !== 'HEAD' && input.method.toUpperCase() !== 'GET' && blob.size > 0) {
        init.body = blob;
      }
      return new Request(url, init);
    });
  } else {
    url = mutateUrl(url);
  }
  return new Request(url, init);
}
Note the special case for body discussed in this answer.
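For completeness, a usage sketch might look like this. mutateUrl is whatever URL transformation you need (the version below, appending ?foo=bar, is just an assumption for the example), and Promise.resolve() smooths over the fact that newRequest may return either a Request or a Promise:
function mutateUrl(url) {
  // hypothetical mutation: append ?foo=bar (or &foo=bar if a query string already exists)
  return url + (url.indexOf('?') === -1 ? '?' : '&') + 'foo=bar';
}

var original = new Request('/orders', { method: 'POST', body: '{"id":1}' });

Promise.resolve(newRequest(original)).then(function (request) {
  return fetch(request);
});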
Related
I'm trying to use the new Fetch API:
I am making a GET request like this:
var request = new Request({
  url: 'http://myapi.com/orders',
  method: 'GET'
});
fetch(request);
However, I'm unsure how to add a query string to the GET request. Ideally, I want to be able to make a GET request to a URL like:
'http://myapi.com/orders?order_id=1'
In jQuery I could do this by passing {order_id: 1} as the data parameter of $.ajax(). Is there an equivalent way to do that with the new Fetch API?
A concise, modern approach:
fetch('https://example.com?' + new URLSearchParams({
  foo: 'value',
  bar: 2,
}))
How it works: when a string (e.g. the URL) is concatenated with an instance of URLSearchParams, its toString() method is automatically called, converting the instance into a string representation, which happens to be a properly encoded query string. If the automatic invocation of toString() is too magical for your liking, you may prefer to call it explicitly: fetch('https://...' + new URLSearchParams(...).toString())
A complete example of a fetch request with query parameters:
// Real example you can copy-paste and play with.
// jsonplaceholder.typicode.com provides a dummy rest-api
// for this sort of purpose.
async function doAsyncTask() {
  const url = (
    'https://jsonplaceholder.typicode.com/comments?' +
    new URLSearchParams({ postId: 1 }).toString()
  );
  const result = await fetch(url)
    .then(response => response.json());
  console.log('Fetched from: ' + url);
  console.log(result);
}

doAsyncTask();
If you are using/supporting...
IE: Internet Explorer does not provide native support for URLSearchParams or fetch, but there are polyfills available.
Node: As of Node 18 there is native support for the fetch API (in version 17.5 it was behind the --experimental-fetch flag). In older versions, you can add the fetch API through a package like node-fetch. URLSearchParams comes with Node, and can be found as a global object since version 10. In older versions you can find it at require('url').URLSearchParams.
Node + TypeScript: If you're using Node and TypeScript together you'll find that, due to some technical limitations, TypeScript does not offer type definitions for the global URLSearchParams. The simplest workaround is to just import it from the url module. See here for more info.
Update March 2017:
URL.searchParams support has officially landed in Chrome 51, but other browsers still require a polyfill.
The official way to work with query parameters is just to add them onto the URL. From the spec, this is an example:
var url = new URL("https://geo.example.org/api"),
    params = {lat: 35.696233, long: 139.570431}
Object.keys(params).forEach(key => url.searchParams.append(key, params[key]))
fetch(url).then(/* … */)
However, I'm not sure Chrome supports the searchParams property of a URL (at the time of writing) so you might want to either use a third party library or roll-your-own solution.
Update April 2018:
With the URLSearchParams constructor you can pass in a 2D array or an object and simply assign the result to url.search, instead of looping over all keys and appending them:
var url = new URL('https://sl.se')
var params = {lat:35.696233, long:139.570431} // or:
var params = [['lat', '35.696233'], ['long', '139.570431']]
url.search = new URLSearchParams(params).toString();
fetch(url)
Sidenote: URLSearchParams is also available in NodeJS
const { URL, URLSearchParams } = require('url');
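// A quick sketch of the same approach in Node. URL and URLSearchParams are
// globals since Node 10, so the require above is only needed on older versions:
const url = new URL('https://sl.se');
url.search = new URLSearchParams({ lat: '35.696233', long: '139.570431' }).toString();
console.log(url.toString()); // https://sl.se/?lat=35.696233&long=139.570431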
let params = {
  "param1": "value1",
  "param2": "value2"
};

let query = Object.keys(params)
  .map(k => encodeURIComponent(k) + '=' + encodeURIComponent(params[k]))
  .join('&');

let url = 'https://example.com/search?' + query;

fetch(url)
  .then(data => data.text())
  .then((text) => {
    console.log('request succeeded with JSON response', text)
  })
  .catch(function (error) {
    console.log('request failed', error)
  });
As already answered, per the spec this isn't possible with the fetch API yet. But I have to note:
If you are on Node, there's the built-in querystring module. It can stringify/parse objects/querystrings:
var querystring = require('querystring')
var data = { key: 'value' }
querystring.stringify(data) // => 'key=value'
...then just append it to the url to request.
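For example, a minimal sketch (assuming a fetch implementation is available, e.g. node-fetch or the built-in fetch of Node 18+):
var querystring = require('querystring')
var data = { order_id: 1 }

// note the manually prepended '?'
fetch('http://myapi.com/orders?' + querystring.stringify(data))
  .then(res => res.json())
  .then(json => console.log(json))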
However, the problem with the above is that you always have to prepend the question mark (?) yourself. So another way is to use the format method from Node's url package and do it as follows:
var url = require('url')
var data = { key: 'value' }
url.format({ query: data }) // => '?key=value'
See query at https://nodejs.org/api/url.html#url_url_format_urlobj
This is possible, as it does internally just this:
search = obj.search || (
  obj.query && ('?' + (
    typeof(obj.query) === 'object' ?
      querystring.stringify(obj.query) :
      String(obj.query)
  ))
) || ''
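So a minimal sketch using url.format could look like this (again assuming a fetch implementation is available in your Node version):
var url = require('url')
var data = { order_id: 1 }

// url.format prepends the '?' for us
fetch('http://myapi.com/orders' + url.format({ query: data }))
  .then(res => res.json())
  .then(json => console.log(json))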
You can use stringify from query-string.
import { stringify } from 'query-string';

const params = { order_id: 1 }; // your query parameters
fetch(`https://example.org?${stringify(params)}`)
encodeQueryString — encode an object as querystring parameters
/**
* Encode an object as url query string parameters
* - includes the leading "?" prefix
* - example input — {key: "value", alpha: "beta"}
* - example output — "?key=value&alpha=beta"
* - returns empty string when given an empty object
*/
function encodeQueryString(params) {
  const keys = Object.keys(params)
  return keys.length
    ? "?" + keys
        .map(key => encodeURIComponent(key)
          + "=" + encodeURIComponent(params[key]))
        .join("&")
    : ""
}
encodeQueryString({key: "value", alpha: "beta"})
//> "?key=value&alpha=beta"
I know this is stating the absolute obvious, but I feel it's worth adding this as an answer as it's the simplest of all:
const orderId = 1;
fetch('http://myapi.com/orders?order_id=' + orderId);
Maybe this is better:
const withQuery = require('with-query');

fetch(withQuery('https://api.github.com/search/repositories', {
  q: 'query',
  sort: 'stars',
  order: 'asc',
}))
  .then(res => res.json())
  .then((json) => {
    console.info(json);
  })
  .catch((err) => {
    console.error(err);
  });
Solution without external packages
To perform a GET request using the fetch API, I worked on this solution that doesn't require installing any packages.
This is an example of a call to Google's Maps API:
// encode to escape spaces
const esc = encodeURIComponent;
const url = 'https://maps.googleapis.com/maps/api/geocode/json?';
const params = {
  key: "asdkfñlaskdGE",
  address: "evergreen avenue",
  city: "New York"
};
// this line takes the params object and builds the query string
const query = Object.keys(params).map(k => `${esc(k)}=${esc(params[k])}`).join('&');
const res = await fetch(url + query);
const googleResponse = await res.json();
Feel free to copy this code and paste it into the console to see how it works!
The generated URL is something like:
https://maps.googleapis.com/maps/api/geocode/json?key=asdkf%C3%B1laskdGE&address=evergreen%20avenue&city=New%20York
This is what I was looking for before I decided to write this. Enjoy :D
Template literals are also a valid option here, and provide a few benefits.
You can include raw strings, numbers, boolean values, etc:
let request = new Request(`https://example.com/?name=${'Patrick'}&number=${1}`);
You can include variables:
let request = new Request(`https://example.com/?name=${nameParam}`);
You can include logic and functions:
let request = new Request(`https://example.com/?name=${nameParam !== undefined ? nameParam : getDefaultName() }`);
As far as structuring the data of a larger query string, I like using an array concatenated to a string. I find it easier to understand than some of the other methods:
let queryString = [
  `param1=${getParam(1)}`,
  `param2=${getParam(2)}`,
  `param3=${getParam(3)}`,
].join('&');

let request = new Request(`https://example.com/?${queryString}`, {
  method: 'GET'
});
I was just working with NativeScript's fetchModule and figured out my own solution using string manipulation.
Append the query string bit by bit to the url. Here is an example where query is passed as a json object (query = {order_id: 1}):
function performGetHttpRequest(fetchLink = 'http://myapi.com/orders', query = null) {
  if (query) {
    fetchLink += '?';
    let count = 0;
    const queryLength = Object.keys(query).length;
    for (let key in query) {
      fetchLink += key + '=' + query[key];
      // only add '&' between parameters, not after the last one
      fetchLink += (count < queryLength - 1) ? '&' : '';
      count++;
    }
  }
  // link becomes: 'http://myapi.com/orders?order_id=1'
  // Then, use fetch as in MDN and simply pass this fetchLink as the url.
  return fetch(fetchLink);
}
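A usage sketch (the endpoint is the hypothetical one from the question):
performGetHttpRequest('http://myapi.com/orders', { order_id: 1 })
  .then(response => response.json())
  .then(json => console.log(json));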
I tested this with multiple query parameters and it worked like a charm :)
Hope this helps someone.
var paramsDate = 01 + '%s' + 12 + '%s' + 2012 + '%s';
request.get("https://www.exampleurl.com?fromDate=" + paramsDate);
I'm using express and request to turn a site's html into json, then returning it. For example:
app.get('/live', function(req, _res){
  res = _res;
  options.url = 'http://targetsite.com';
  request(options, parseLive);
});

function parseLive(err, resp, html) {
  var ret = {status: 'ok'};
  -- error checking and parsing of html --
  res.send(ret);
}
Currently I'm using a global var res to keep track of the return call, but this fails when multiple requests are made at the same time. So, I need some way of matching return calls from express to their callbacks in request.
How might I do this?
Use a closure.
Pass the variable to a function. Return the function you want to pass to request from that function.
app.get('/live', function(req, _res){
  options.url = 'http://targetsite.com';
  request(options, parseLiveFactory(_res));
});

function parseLiveFactory(res) {
  function parseLive(err, resp, html) {
    var ret = {status: 'ok'};
    -- error checking and parsing of html --
    res.send(ret);
  }
  return parseLive;
}
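Another way to get the same closure is to define the callback inline inside the route handler, so it closes over res directly:
app.get('/live', function (req, res) {
  options.url = 'http://targetsite.com';
  request(options, function parseLive(err, resp, html) {
    var ret = { status: 'ok' };
    // -- error checking and parsing of html --
    res.send(ret);
  });
});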
We are trying to modify an existing script which uses backbone.js to fetch JSON from a URL and render it in a defined way on screen.
Earlier the script was pointing to an external PHP file to fetch the JSON from it.
url: function () {
  var ajaxValue = document.getElementById('ajax').value;
  if (ajaxValue == 0) {
    return this.options.apiBase + '/liveEvents.json';
  } else {
    var eventDate = document.getElementById('timestamp').value;
    return this.options.apiBase + '/ajax.php?eventDate=' + eventDate;
  }
},
But now we are trying to omit the requirement of PHP and get the JSON purely using JavaScript. For this, we created a JS function fetch_data_set() that returns proper JSON:
var ArrayMerge = array1.concat(array2,array3,array4);
return JSON.stringify(ArrayMerge);
So our question is, how can we feed this JSON to backbone instead of using an external URL. Because if we do this (which is obviously wrong):
url: function () {
  var ajaxValue = document.getElementById('ajax').value;
  if (ajaxValue == 0) {
    var data_set = fetch_data_set();
    return data_set;
  }
},
It throws the error: Error: A "url" property or function must be specified
The main point is to extend Backbone.sync instead of the url() method; that way you can fetch your models from any kind of source, in any kind of model, and you can do something similar to this link:
https://github.com/huffingtonpost/backbone-fixtures/blob/master/backbone-fixtures.js
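A minimal sketch of that idea, assuming fetch_data_set() is your function that returns the JSON string (the collection name is made up, and the success callback signature may differ slightly depending on your Backbone version):
var EventsCollection = Backbone.Collection.extend({
  sync: function (method, collection, options) {
    if (method === 'read') {
      // build the data locally instead of fetching it from a URL
      var data = JSON.parse(fetch_data_set());
      options.success(data);
      return;
    }
    // fall back to the default behaviour for anything else
    return Backbone.sync.apply(this, arguments);
  }
});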
Backbone.Model contains a sync() function able to load JSON data from an url. sync() uses the url() function to determine from where it should fetch data. (Note : sync() is called under-the-hood by save(), fetch() and destroy())
The trick here is that you should stop overriding url() and reimplement sync() directly instead, cf. http://backbonejs.org/#Model-sync
Here is an example:
// specialized version to be used with a store.js - like object
sync: function(method, model, options) {
  console.log("sync_to_store begin('" + method + "',...) called with ", arguments);
  var when_deferred = when.defer();
  var id = this.url();
  if (method === "read") {
    if (typeof id === 'undefined')
      throw new Error("can't fetch without id !");
    var data = model.store_.get(id);
    // apply fetched data
    model.set(data);
    when_deferred.resolve([model, undefined, options]);
  }
  else if (method === "create") {
    // use Backbone id as server id
    model.id = model.cid;
    model.store_.set(id, model.attributes);
    when_deferred.resolve([model, undefined, options]);
  }
  else if (method === "update") {
    if (typeof id === 'undefined')
      throw new Error("can't update without id !");
    model.store_.set(id, model.attributes);
    when_deferred.resolve([model, undefined, options]);
  }
  else if (method === "delete") {
    if (typeof id === 'undefined')
      throw new Error("can't delete without id !");
    model.store_.set(id, undefined);
    model.id = undefined;
    when_deferred.resolve([model, undefined, options]);
  }
  else {
    // WAT ?
  }
  console.log("sync_to_store end - Current changes = ", model.changed_attributes());
  return when_deferred.promise;
}
Note 1: the API is slightly different from vanilla Backbone since I return a when promise.
Note 2: url() is still used, as an id.
For mapping my backbone models to what I get from the server I am using a technique described on the GroupOn Dev blog: https://engineering.groupon.com/2012/javascript/extending-backbone-js-to-map-rough-api-responses-into-beautiful-client-side-models/
However, this only maps incoming data to the model.
I would like this to go both ways, so that when I save a model, it prepares the model's attributes to match the server's model.
What would be the best solution to prepare the output of the model?
I've run into this exact same issue where my server response is completely different from what I am able to post. I discovered within the mechanics of Backbone.sync a way to post a custom JSON object to my server, based on the following statement in Backbone.sync:
if (!options.data && model && (method == 'create' || method == 'update')) {
  params.contentType = 'application/json';
  params.data = JSON.stringify(model.toJSON());
}
sync checks whether options.data exists; if it doesn't, it sets params.data to the stringified model. The options.data check is what keyed me off: if it does exist, sync will use it instead of the model. Given this, I overrode my model's save so I could pass in an attributes hash that my server expects.
Here's how I overrode it:
save: function(key, value, options) {
  var attributes = {}, opts = {};

  // Need to use the same conditional that Backbone is using
  // in its default save so that attributes and options
  // are properly passed on to the prototype
  if (_.isObject(key) || key == null) {
    attributes = key;
    opts = value;
  } else {
    attributes = {};
    attributes[key] = value;
    opts = options;
  }

  // In order to set .data to be used by Backbone.sync
  // both opts and attributes must be defined
  if (opts && attributes) {
    opts.data = JSON.stringify(attributes);
    opts.contentType = "application/json";
  }

  // Finally, make a call to the default save now that we've
  // got all the details worked out.
  return Backbone.Model.prototype.save.call(this, attributes, opts);
}
So how do you use this in your case? Essentially what you'll do is create a method that reverses the mapping and returns the resulting JSON. Then you can invoke save from your view or controller as follows:
getReversedMapping: function() {
  var reversedMap = {};
  ...
  return reversedMap;
},

saveToServer: function() {
  this._model.save(this.getReversedMapping(), {
    success: function(model, response) {
      ...
    },
    error: function(model, response) {
      ...
    }
  })
}
Since your overridden save automatically copies the JSON you pass in to options.data, Backbone.sync will use it to post.
The answer by Brendan Delumpa works, but it over-complicates things.
Don't do this in your save method. You don't want to copy over these parameter checks each time (and what if they somehow change in Backbone?).
Instead, overwrite the sync method in your model like this:
var MyModel = Backbone.Model.extend({
  ...,
  sync: function (method, model, options) {
    if (method === 'create' || method === 'update') {
      // get data from model, manipulate and store in "data" variable
      // ...
      options.data = JSON.stringify(data);
      options.contentType = 'application/json';
    }
    return Backbone.Model.prototype.sync.apply(this, arguments);
  }
});
That's all there is to it when you need to "prepare" the data in a server-ready format.
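For example, a hypothetical mapping (the attribute names are made up for illustration) could fill in the commented part like so:
var PersonModel = Backbone.Model.extend({
  sync: function (method, model, options) {
    if (method === 'create' || method === 'update') {
      // hypothetical reverse mapping: the server expects snake_case keys
      var data = {
        first_name: model.get('firstName'),
        last_name: model.get('lastName')
      };
      options.data = JSON.stringify(data);
      options.contentType = 'application/json';
    }
    return Backbone.Model.prototype.sync.apply(this, arguments);
  }
});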
After watching RailsCast #296 about Mercury Editor, I am trying to get the editor to redirect to a newly created resource.
I can already redirect on the client-side using JS and window.location.href=. But for a new resource, I cannot "guess" its URL on the client-side. I need it to be in the server response.
However, the problem is that I don't see the possibility of using the server response in the editor. No matter what the controller renders, the server response is discarded by Mercury instead of used as an argument to my function for mercury:saved.
Is there a way to get around this?
I was able to do this on update by sending a valid JSON string back. I would assume create works the same way. Check Firebug to make sure you're not getting an error in the jQuery.ajax call that Mercury uses.
posts_controller.rb
def mercury_update
  post = Post.find(params[:id])
  post.title = params[:content][:post_title][:value]
  post.body = params[:content][:post_body][:value]
  post.save!
  render text: '{"url":"' + post_path(post.slug) + '"}'
end
mercury.js:
jQuery(window).on('mercury:ready', function() {
  Mercury.on('saved', function() {
    window.location.href = arguments[1].url;
  });
});
Note: I'm using friendly_id to slug my posts.
Redirecting on the server side doesn't work because the save button is just a jQuery.ajax call:
// page_editor.js
PageEditor.prototype.save = function(callback) {
  var data, method, options, url, _ref, _ref1,
    _this = this;
  url = (_ref = (_ref1 = this.saveUrl) != null ? _ref1 : Mercury.saveUrl) != null ? _ref : this.iframeSrc();
  data = this.serialize();
  data = {
    content: data
  };
  if (this.options.saveMethod === 'POST') {
    method = 'POST';
  } else {
    method = 'PUT';
    data['_method'] = method;
  }
  Mercury.log('saving', data);
  options = {
    headers: Mercury.ajaxHeaders(),
    type: method,
    dataType: this.options.saveDataType,
    data: data,
    success: function(response) {
      Mercury.changes = false;
      Mercury.trigger('saved', response);
      if (typeof callback === 'function') {
        return callback();
      }
    },
    error: function(response) {
      Mercury.trigger('save_failed', response);
      return Mercury.notify('Mercury was unable to save to the url: %s', url);
    }
  };
  if (this.options.saveStyle !== 'form') {
    options['data'] = jQuery.toJSON(data);
    options['contentType'] = 'application/json';
  }
  return jQuery.ajax(url, options);
};
So your redirect is sent to the success callback, but the page doesn't actually re-render, as with any successful AJAX request. The author discusses overriding this very function here. It also looks like there might be some room to maneuver here by passing a callback function to save.
Btw, another way to do what #corneliusk suggests is:
render json: { url: post_path(post.slug) }
Either way, the response body is passed as an argument to the function in the mercury:saved callback.