Consume JSON from a URL in Express - javascript

The question is: how can I import JSON from a URL specifically, NOT from an internal file, in Express, and contain it so that I can use it across multiple views? For example, I have a controller. How can I get it in there (the controller)? I am using request.
I have a router with 2 routes but I want to have a bunch more, and the bulk of the logic for the routes is being done in controllers.
Below is a controller with the route for showing all. I had hardcoded a small piece of "json" in it as temporary data, but now I want to populate my view via an outside API. This is my controller:
module.exports = {
  // show all dogs
  showDogs: (req, res) => {
    const dogs = [
      {
        name: "Fluffy",
        breed: "ChowChow",
        slug: "fluffy",
        description: "4 year old Chow. Really, really fluffy."
      },
      {
        name: "Buddy",
        breed: "White Lab",
        slug: "buddy",
        description: "A friendly 6 year old white lab mix. Loves playing ball"
      },
      {
        name: "Derbis",
        breed: "Schmerbis",
        slug: "derbis",
        description: "A real Schmerbis Derbis"
      }
    ];
    res.render("pages/dogs", { dogs: dogs, title: "All Dogs" });
  }
};
How can I get this JSON data to come from an outside source? I have used request before, but I don't know how to transfer the data between files. I don't want to put it inside showDogs or it won't be accessible to the other functions here. Right?
I had something like this below, with require('request') at the top of the controller, but it just gave errors.
const options = {
  url: 'https://raw.githubusercontent.com/matteocrippa/dogbreedjsondatabase/master/dog-breed.json',
  method: 'GET',
  headers: {
    'Accept-Charset': "utf-8"
    // NO IDEA ABOUT THIS AREA FOR NOW EITHER
  }
}
I also tried wrapping the entire thing, all the functions, in a request:
request('https://raw.githubusercontent.com/matteocrippa/dogbreedjsondatabase/master/dog-breed.json', function(error, response, body)
But still I got an error.
And this is the route.js that the controller plugs into:
//dogs
router.get('/dogs', dogsController.showDogs)
I am a Node beginner, so the only thought I have is to write some middleware. The deeper problem here is that I don't know how to write or use middleware properly. Perhaps someone can point me in the right direction.

Add a utility file that contains the code to talk to the external API. Include this file and use its function to get the dog data. Later, you can add more functions for other APIs as well.
const getDogData = require('../externalApis').getDogData;

module.exports = {
  // show all dogs
  showDogs: (req, res) => {
    getDogData(function (err, dogs) {
      if (err) {
        // handle err
      } else {
        res.render("pages/dogs", {
          dogs: dogs,
          title: "All Dogs"
        });
      }
    });
  }
};
// externalApis.js
const request = require('request');

module.exports = {
  getDogData: function (done) {
    const options = {
      url: 'https://raw.githubusercontent.com/matteocrippa/dogbreedjsondatabase/master/dog-breed.json',
      method: 'GET',
      headers: {
        'Accept-Charset': 'utf-8'
      }
    };
    request(options, function (error, response, body) {
      if (error) {
        return done(error);
      } else {
        var data = JSON.parse(body); // not sure how the data is returned or if it needs JSON.parse
        return done(null, data.dogs); // return dogs
      }
    });
  }
};
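A note in passing: the request package has since been deprecated. If you would rather avoid it, a minimal sketch of the same getDogData helper using Node's built-in https module (no extra dependency) could look like this; the data.dogs shape is the same assumption as in the snippet above:

// externalApis.js (alternative without the deprecated request package)
const https = require('https');

module.exports = {
  getDogData: function (done) {
    const url = 'https://raw.githubusercontent.com/matteocrippa/dogbreedjsondatabase/master/dog-breed.json';
    https.get(url, (res) => {
      let body = '';
      res.on('data', (chunk) => { body += chunk; });
      res.on('end', () => {
        try {
          const data = JSON.parse(body);
          done(null, data.dogs); // assumes the JSON has a top-level "dogs" array
        } catch (parseErr) {
          done(parseErr);
        }
      });
    }).on('error', done);
  }
};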

Related

Append a query param to a GET request?

I'm trying to make a simple API that calls another API that will return some information. The thing is, in order to connect to the second API, I need to attach query parameters to it.
So what I've tried to do so far is to use axios.get to fetch the API. If I didn't need to add queries on top of that, this would be really simple, but I'm having a really hard time figuring out how to attach queries to my request.
I've created an object that pulled the original query from my end, and then I used JSON.stringify to turn the object I made into JSON. Then, from my understanding of Axios, you can attach params by separating the URL with a comma.
On line 6, I wasn't sure if variables would carry over, but I definitely can't have the tag var turned into the string "tag", so that's why I left it with the curly brackets and the backticks. If that's wrong, then please correct me on how to do it properly.
The tag var is the name of the query that I extracted on my end. That tag is what needs to be transferred over to the Axios GET request.
app.get('/api/posts', async (req, res) => {
  try {
    const url = 'https://myurl.com/blah/blah';
    let tag = req.query.tag;
    objParam = {
      tag: `${tag}`
    };
    jsonParam = JSON.stringify(objParam);
    let response = await axios.get(url, jsonParam);
    res.json(response);
  } catch (err) {
    res.send(err);
  }
});
response is SUPPOSED to equal a JSON file that I'm making the request to.
What I'm actually getting is a 400 error, which makes me think that somehow the URL axios is using and the params aren't lining up. (Is there a way to check where the Axios request is going? If I could see the actual URL that axios fires off to, it could help me fix my problem.)
Ideally, this is the flow that I want to achieve. Something is wrong with it but I'm not quite sure where the error is.
-> I make a request to MY api, using the query "science" for example
-> Through my API, Axios makes a GET request to:
https://myurl.com/blah/blah?tag=science
-> I get a response with the JSON from the GET request
-> my API displays the JSON file
After looking at Axios' README, it looks like the second argument needs the key params. You can try:
app.get('/api/posts', async (req, res, next) => {
  try {
    const url = 'https://myurl.com/blah/blah';
    const options = {
      params: { tag: req.query.tag }
    };
    const response = await axios.get(url, options);
    res.json(response.data);
  } catch (err) {
    // Be sure to call next() if you aren't handling the error.
    next(err);
  }
});
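As for checking where the request actually goes: newer axios versions (0.19 and up) expose a getUri helper that serializes a config into the final URL without sending anything, which is handy for debugging. A small sketch, assuming the same URL and tag as above:

const axios = require('axios');

// Builds (but does not send) the final URL,
// e.g. https://myurl.com/blah/blah?tag=science
const finalUrl = axios.getUri({
  url: 'https://myurl.com/blah/blah',
  params: { tag: 'science' }
});
console.log(finalUrl);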
If the above method does not work, you can look into query-string.
const querystring = require('query-string');

app.get('/api/posts', async (req, res, next) => {
  try {
    const url = 'https://myurl.com/blah/blah?' +
      querystring.stringify({ tag: req.query.tag });
    const response = await axios.get(url);
    res.json(response.data);
  } catch (err) {
    next(err);
  }
});
Responding to your comment, yes, you can combine multiple Axios responses. For example, if I am expecting an object literal to be my response.data, I can do:
const response1 = await axios.get(url1)
const response2 = await axios.get(url2)
const response3 = await axios.get(url3)
const combined = [
  { ...response1.data },
  { ...response2.data },
  { ...response3.data }
]
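If the three requests don't depend on each other, you can also fire them concurrently with Promise.all instead of awaiting them one by one; same result, less waiting:

// Run all three requests in parallel and wait for all of them
const [response1, response2, response3] = await Promise.all([
  axios.get(url1),
  axios.get(url2),
  axios.get(url3)
])

const combined = [
  { ...response1.data },
  { ...response2.data },
  { ...response3.data }
]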

NodeJS - Request file and zip it

I am currently in the process of creating a REST API for my personal website. I'd like to include some downloads, and I would like to offer the possibility of selecting multiple ones and downloading them as a zip file.
My first approach was pretty easy: an array of URLs, a request for each of them, zip the results, send to the user, delete. However, I think this approach is too dirty, considering there are things like streams around, which seem quite fitting for this task.
Now, I have been experimenting and am currently struggling with the basic concept of working with streams and events across different scopes.
The following worked:
const r = request(url, options);
r.on('response', function (res) {
  res.pipe(fs.createWriteStream('./file.jpg'));
});
From my understanding, r is an incoming stream in this scenario, and I listen for the response event on it; as soon as it occurs, I pipe it to a write stream that I use to write to the file system.
My first step was to refactor this so it fits my case better, but I already failed here:
async function downloadFile(url) {
  return request({ method: 'GET', uri: url });
}
Now I wanted to use a function that calls downloadFile() with different URLs and saves all those files to disk using createWriteStream() again:
const urls = ['https://download1', 'https://download2', 'https://download3'];

urls.forEach(element => {
  downloadFile(element).then(data => {
    data.pipe(fs.createWriteStream('file.jpg'));
  });
});
Using the debugger, I found out that the "response" event is non-existent in the data object -- maybe that's already the issue? Moreover, I figured out that data.body contains the bytes of my downloaded document (a PDF in this case), so I wonder if I could just stream this to some other place?
After reading some Stack Overflow threads I found the following module: archiver
Reading this thread: Dynamically create and stream zip to client
@dankohn suggested an approach like this:
archive
  .append(fs.createReadStream(file1), { name: 'file1.txt' })
  .append(fs.createReadStream(file2), { name: 'file2.txt' });
This makes me assume I need to be able to extract a stream from my data object to proceed.
Am I on the wrong track here or am I getting something fundamentally wrong?
Using archiver seems to be a valid approach; however, it would be advisable to use streams when feeding large data from the web into the zip archive. Otherwise, the whole archive data would need to be held in memory.
archiver does not support adding files from streams, but zip-stream does. For reading a stream from the web, request comes in handy.
Example
// npm install -s express zip-stream request
const request = require('request');
const ZipStream = require('zip-stream');
const express = require('express');

const app = express();

app.get('/archive.zip', (req, res) => {
  var zip = new ZipStream();
  zip.pipe(res);

  var stream = request('https://loremflickr.com/640/480');
  zip.entry(stream, { name: 'picture.jpg' }, err => {
    if (err)
      throw err;
  });
  zip.finalize();
});

app.listen(3000);
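One optional refinement (not in the original answer): setting the download headers explicitly so browsers save the response as a file. A sketch of the same route with headers added and finalize() moved into the entry callback:

app.get('/archive.zip', (req, res) => {
  // Hint to the browser that this is a zip download
  res.set('Content-Type', 'application/zip');
  res.set('Content-Disposition', 'attachment; filename="archive.zip"');

  var zip = new ZipStream();
  zip.pipe(res);

  var stream = request('https://loremflickr.com/640/480');
  zip.entry(stream, { name: 'picture.jpg' }, err => {
    if (err)
      throw err;
    // finalize only after the entry has been fully consumed
    zip.finalize();
  });
});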
Update: Example for using multiple files
Adding an example which processes the next file in the callback function of zip.entry() recursively.
app.get('/archive.zip', (req, res) => {
  var zip = new ZipStream();
  zip.pipe(res);

  var queue = [
    { name: 'one.jpg', url: 'https://loremflickr.com/640/480' },
    { name: 'two.jpg', url: 'https://loremflickr.com/640/480' },
    { name: 'three.jpg', url: 'https://loremflickr.com/640/480' }
  ];

  function addNextFile() {
    var elem = queue.shift();
    var stream = request(elem.url);
    zip.entry(stream, { name: elem.name }, err => {
      if (err)
        throw err;
      if (queue.length > 0)
        addNextFile();
      else
        zip.finalize();
    });
  }

  addNextFile();
});
Using Async/Await
You can encapsulate it into a promise to use async/await like:
await new Promise((resolve, reject) => {
  zip.entry(stream, { name: elem.name }, err => {
    if (err) reject(err)
    resolve()
  })
})
zip.finalize()
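Putting the pieces together, the whole queue can then be processed with a plain for...of loop instead of the recursion. A sketch using the same queue, request, and zip-stream setup as above (error handling kept minimal):

app.get('/archive.zip', async (req, res) => {
  const zip = new ZipStream();
  zip.pipe(res);

  const queue = [
    { name: 'one.jpg', url: 'https://loremflickr.com/640/480' },
    { name: 'two.jpg', url: 'https://loremflickr.com/640/480' }
  ];

  for (const elem of queue) {
    const stream = request(elem.url);
    // Wait until zip-stream has fully consumed this entry before starting the next
    await new Promise((resolve, reject) => {
      zip.entry(stream, { name: elem.name }, err => (err ? reject(err) : resolve()));
    });
  }
  zip.finalize();
});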

Access API endpoints in MEANjs from server controller

So I have this problem: I am working on a 'following' feature in my application. What's important is that I have two models:
Follows and Notifications
When I hit the follow button in the front-end, I run a function from follow.client.controller.js which POSTs to the API endpoint /api/follows, which corresponds to follow.server.controller.js, and then an update action on the Follows model is performed - easy. AFAIK that's how it works (and it works for me).
But in follows.server.controller.js I also want to invoke a POST to the API endpoint at /api/notifications, which corresponds to notifications.server.controller.js, but I can't find a proper way to do that. Any help will be appreciated.
I don't want another call from the front-end to add the notification, because it should be automatic: if a user starts following someone, the information is saved in both models at once.
You can add middleware in your server route.
app.route('/api/follows')
  .post(notification.firstFunction, follows.secondFunction);
Now add two methods in your controllers. The first makes the call to the db and adds some of the result's data to the request object, which is then forwarded to the second method.
exports.firstFunction = function (req, res, next) {
  Notification.doSomething({
  }).exec(function (err, result) {
    if (err) return next(err);
    req.yourValueToPassForward = result;
    next(); // <-- important
  });
};

exports.secondFunction = function (req, res) {
  //...
};
Or you can make a few database calls in one API method, joining the calls with promises. Example:
var promise = Meetups.find({ tags: 'javascript' }).select('_id').exec();

promise.then(function (meetups) {
  var ids = meetups.map(function (m) {
    return m._id;
  });
  return People.find({ meetups: { $in: ids } }).exec();
}).then(function (people) {
  if (people.length < 10000) {
    throw new Error('Too few people!!!');
  } else {
    throw new Error('Still need more people!!!');
  }
}).then(null, function (err) {
  assert.ok(err instanceof Error);
});
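The same chain reads more directly with async/await, if your Node and Mongoose versions support it. A sketch using the same Meetups and People models as above:

async function findMeetupPeople() {
  // First query: get the ids of all javascript meetups
  const meetups = await Meetups.find({ tags: 'javascript' }).select('_id').exec();
  const ids = meetups.map(function (m) { return m._id; });
  // Second query: find the people attending those meetups
  return People.find({ meetups: { $in: ids } }).exec();
}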

Saving to MongoDB when Socket.io event is emitted

I'm utilizing a MEAN stack and Socket.io to pull images from the real-time Instagram API. Everything is working great, but I now want to begin saving image data to a MongoDB database so I have a "history" of images from locations (rather than simply the most recent photos).
Below is the relevant (working) code I have so far:
Node server-side code to handle new photo updates from Instagram API and emit event to Angular controller:
// for each new post Instagram sends the data
app.post('/callback', function (req, res) {
  var data = req.body;
  // grab the object_id (as geo_id) of the subscription and send as an argument to the client side
  data.forEach(function (data) {
    var geo_id = data.object_id;
    sendUpdate(geo_id);
  });
  res.end();
});

// send the url with the geo_id to the client side
// to do the ajax call
function sendUpdate(geo_id) {
  io.sockets.emit('newImage', { geo_id: geo_id });
}
Angular controller code when 'newImage' event is received:
socket.on('newImage', function (geo_id) {
  // pass geo_id into Instagram API call
  Instagram.get(geo_id).success(function (response) {
    instagramSuccess(response.geo_id, response);
  });

  // Instagram API callback
  var instagramSuccess = function (scope, res) {
    if (res.meta.code !== 200) {
      scope.error = res.meta.error_type + ' | ' + res.meta.error_message;
      return;
    }
    if (res.data.length > 0) {
      $scope.items = res.data;
    } else {
      scope.error = "This location has returned no results";
    }
  };
});
Angular factory to handle calls to Instagram API:
angular.module('InstaFactory', []).factory('Instagram', function ($http) {
  var base = "https://api.instagram.com/v1";
  var client_id = 'MY-CLIENT-ID';

  return {
    'get': function (geo_id) {
      var request = '/geographies/' + geo_id.geo_id + '/media/recent?client_id=' + client_id;
      var url = base + request;
      var config = {
        'params': {
          'callback': 'JSON_CALLBACK'
        }
      };
      return $http.jsonp(url, config);
    }
  };
});
I also have the following Angular Controller which currently GETS details of each location from my Stadia mongoDB model. This model also contains an (empty for now) 'photos' array that I want to PUSH photo details (url, username, user profile url, etc.) onto each time I receive them from Instagram:
angular.module('StadiaFactory', []).factory('Stadia', function ($http) {
  var base = "http://localhost:6060/api/stadia/";

  return {
    'get': function (id) {
      var request = id;
      var url = base + request;
      var config = {
        'params': {
          'callback': 'JSON_CALLBACK'
        }
      };
      return $http.jsonp(url, config);
    }
  };
});
This is where I get confused. Where do I fire off the PUT request to my Stadia API and does this Node route for my Stadia API look reasonable? Note: I omitted my GET route which works perfectly. PUT is just throwing me for a loop:
// add photos to stadium photos array
app.put('/api/stadia/:stadium_id', function (req, res) {
  // use mongoose to get and update the stadium
  Stadium.findByIdAndUpdate(req.params.stadium_id,
    { $push: { "photos": { img: ?, link: ?, username: ?, profile_picture: ? } } },
    { safe: true, upsert: true },
    function (err, stadium) {
      // if there is an error retrieving, send the error. nothing after res.send(err) will execute
      if (err)
        res.send(err);
      res.jsonp(stadium); // return stadium in JSON format
    });
});
Well there are a few problems with your current structure.
When your callback route is called, with possibly N objects in it, you're triggering your socket event and retrieving all the latest photos of your geography each time. So if you have 3 new objects, you will call the same thing 3 times to get the same data, which is a bit of a waste when you have the power of sockets.
You can also run into problems if you try to get the object data on the client side and PUT it to your server, since all your clients may receive the socket and you could end up with duplicates, not to mention that this is a lot of traffic for not much, and it will burn through your API quota. It is also not safe on the client side, since everyone can see your key.
To me, a good way to get something working (even if I don't really know what your :stadium_id param stands for) is to get the info you want directly on the server side, in your callback, using the request module.
You should only fetch the pictures, because you can retrieve a lot of things like users, tags or videos that you may not want. So you will have to listen for the image objects, and nothing else.
You could have something like this:
var request = require('request');
var CLIENT_ID = 'yourId';

function newImage(data) {
  io.sockets.emit('newImage', data);
}

app.post('/callback', function (req, res) {
  // loop over all the new objects
  req.body.forEach(function (data) {
    if (data.type !== 'image') { return; }
    // BTW I think you should try the id property instead of object_id
    request('https://api.instagram.com/v1/media/' + data.object_id + '?access_token=' + CLIENT_ID,
      function (error, response, body) {
        if (error) { return; }
        // Here we have one JSON object with all the info about the image
        var image = JSON.parse(body);
        // Save the new object to your DB. (replace STADIUM_ID)
        Stadium.findByIdAndUpdate(STADIUM_ID,
          { $push: { 'photos': {
            img: image.images.standard_resolution.url,
            link: image.link,
            username: image.user.username,
            profile_picture: image.user.profile_picture
          } } },
          { safe: true, upsert: true });
        // Send a socket to your client with the new image
        newImage({
          id: image.id,
          img: image.images.standard_resolution.url,
          link: image.link,
          username: image.user.username,
          profile: image.user.profile_picture
        });
      });
  });
  res.end();
});
And then in your client, you will only have to push the new images received in the newImage socket event into $scope.items.
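That client-side handler could be as small as the sketch below, assuming the same socket service used in your controller (if your socket wrapper already triggers an Angular digest, drop the $scope.$apply):

socket.on('newImage', function (image) {
  $scope.$apply(function () {
    // prepend the newest photo so it shows up first in the view
    $scope.items.unshift(image);
  });
});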

Node API design and re-using code

My API has three endpoints: articles, websites, and users. Each article is associated with a website. A user can also share articles.
In my API, I have just created an endpoint at /website/:id/articles. This will query the database for articles associated with the given website. It then performs some manipulation on the data for each article based on who is talking to the API ("has the user shared this article?", for example).
I am now moving on to create a similar endpoint at /users/:id/shared-articles. The database query for this is slightly different, but the manipulation I want to perform on the articles data following the query is the same as before.
Here is some pseudo code for the former endpoint:
router.get('/websites/:id/articles', function (req, res) {
  articleService.find({ websiteId: req.params.id }, function (error, foundArticles) {
    async.waterfall([
      function (cb) {
        // Manipulate foundArticles…
        cb(null, manipulatedArticles)
      },
      function (articles, cb) {
        // Manipulate articles some more…
        cb(null, manipulatedArticles)
      },
    ], function (error, articles) {
      if (error) {
        return res.json(error, 400)
      }
      res.json(articles)
    })
  })
})
To create my new endpoint, /users/:id/shared-articles, I could abstract the manipulation tasks into a function that can be shared by both of my endpoints (the waterfall seen above), reducing code repetition.
router.get('/websites/:id/articles', function (req, res) {
  articleService.find({ websiteId: req.params.id }, function (error, foundArticles) {
    manipulateArticles(foundArticles, function (articles) {
      if (error) {
        return res.json(error, 400)
      }
      res.json(articles)
    })
  })
})

router.get('/users/:id/shared-articles', function (req, res) {
  shareActionService.find({ userId: req.params.id }, function (error, foundShareActions) {
    var sharedArticleIds = { _id: { $in: _.pluck(foundShareActions, 'sharedArticleId') } }
    articleService.find(sharedArticleIds, function (error, foundArticles) {
      manipulateArticles(foundArticles, function (articles) {
        if (error) {
          return res.json(error, 400)
        }
        res.json(articles)
      })
    })
  })
})
However, I figured that this sort of code re-use problem must be common when designing APIs in Node, and I would like to know if there is an obviously better solution that I am missing here.
One idea I had would be to have all article sub-resources (such as /users/:id/shared-articles or /websites/:id/links) talk to the /links API internally, which itself would deal with the manipulation I mention above. The problem then is that I would have to make /links very verbose in the query headers/parameters it needs, in order to allow for the different database queries needed (such as those by the two sub-resource endpoints demonstrated here).
Is there a better solution/abstraction here?
You can create a "service" layer. Abstract the link manipulation into a completely separate file and call it from each of the routes.
Create a service/links.js:
module.exports = {
  manipulateLinks: function (response) {
    // Manipulate code
    return response
  }
}
Then in your routes, call the function:
var linkservice = require('../service/links')
var response = linkservice.manipulateLinks(response)
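With that in place, each route boils down to its own query plus one shared call into the service. A sketch of the first route, assuming manipulateLinks is synchronous as in the snippet above:

var linkservice = require('../service/links')

router.get('/websites/:id/articles', function (req, res) {
  articleService.find({ websiteId: req.params.id }, function (error, foundArticles) {
    if (error) {
      return res.status(400).json(error)
    }
    // all the shared manipulation now lives in one place
    res.json(linkservice.manipulateLinks(foundArticles))
  })
})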
