axios request to docker daemon - javascript

Description:
In a GraphQL gateway, I would like to know the services available in Docker so that I can stitch together the schema from the other GraphQL services. All the applications run in Docker, and a Docker Compose file is used to start them.
The Docker Engine provides a REST API to list all the services.
Inside the Docker Compose file we also mount the Docker socket as a volume:
volumes:
  - /var/run/docker.sock:/var/run/docker.sock
Problem:
I used the Node.js core http module and was able to get the result:
const http = require("http");

const result = http.request(
  {
    socketPath: "/var/run/docker.sock",
    path: "/containers/json",
  },
  (res) => {
    res.setEncoding("utf8");
    res.on("data", (data) => console.log(data));
    res.on("error", (err) => console.error(err));
  }
);
result.end();
I am not able to get the Docker services using axios. I also found that even though axios has a socketPath option, it does not have a path option to go with it.
I used the following code with axios:
const axiosResult = await axios({
  socketPath: "/var/run/docker.sock",
  url: "/containers/json",
  method: "GET",
});
I also tried using a different URL: http://unix:/var/run/docker.sock/v1.30/containers/json

When using axios, the code has to be modified as follows to get the result:
const { data } = await axios.get("http://unix:/containers/json", {
  socketPath: "/var/run/docker.sock",
});
console.log(data);
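As a usage sketch (assuming the socket is mounted as above), the same pattern works for any Engine API path; the http://unix: host portion is effectively a placeholder, since axios routes the request through socketPath. The dockerGet helper below is hypothetical:

const axios = require("axios");

// Hypothetical helper: GET an arbitrary Docker Engine API path over the unix socket.
async function dockerGet(path) {
  const { data } = await axios.get(`http://unix:${path}`, {
    socketPath: "/var/run/docker.sock",
  });
  return data;
}

// List containers; a versioned path such as /v1.30/containers/json also works.
dockerGet("/containers/json").then((containers) =>
  containers.forEach((c) => console.log(c.Names, c.State))
);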

Related

GET Request repeatedly failed on the front end but not on backend

I'm working on a personal project that will allow users to find new books based on their genre preferences. The database I'm using is MongoDB. While I'm able to get all the data on the backend using Postman, I can't get it properly displayed on the frontend. At the moment, I'm just trying to get the data sent to the front end and at least console.log'd, but it isn't making it that far.
Here is the code in the routes file.
router.get('/books/:genre', bookBuilder.get_some_books)
Here's the code on the backend that the routes file is pointing to and is working:
exports.get_some_books = async function (req, res) {
  let { genre } = req.params;
  try {
    let books = await Book.find({ "genre": genre });
    if (books) {
      res.json(books)
    } else {
      res.status(404).send({ error: 'Not Found' });
    }
  } catch (err) {
    res.status(500).send({ error: err.message });
  }
}
Here's my code on the frontend that is not working.
async getEverything() {
  try {
    let pbBooks = await axios.get(`/books/`, {
      method: 'GET',
      headers: { 'Content-Type': 'application/json' },
      params: {
        genre: 'PB'
      }
    })
    if (pbBooks) {
      console.log(pbBooks)
    } else {
      this.$router.push('/Error');
    }
  } catch (err) {
    console.log(`Network error: ${err.message}`)
  }
}
My code stack is Vue.js, Express.js, Node.js and Axios. On the frontend, I've tried changing the URL in axios.get() to '/books/PB', and also tried getEverything(genre) with /books/${genre}, but neither seems to work.
The error I am getting is a 404 Request Failed error that is from the catch block in the getEverything() function. I'm not sure why the frontend is unable to get the data when the backend works just fine. Is there anything I'm doing wrong?
404 is the HTTP status code for Not Found, which implies there is no route set up on localhost for /books. Actually, /books would route to your app, not the backend (unless you have a proxy set up on your app server that redirects to the backend).
If a proxy were involved, it's likely misconfigured. Otherwise, the target URL in the axios request should be <backend_url>/books (e.g., http://localhost:9999/books with the backend running locally on port 9999 and the app on some other port).
Change
let pbBooks = await axios.get(`/books/`, {
...
to
let genre = "PB"
let pbBooks = await axios.get(`/books/${genre}`, {
  method: 'GET',
  headers: { 'Content-Type': 'application/json' }
})
The reason is that the params part of the config object is converted to a query string (/books?genre=PB) instead of the path /books/PB, which is what the backend is expecting.
More: https://masteringjs.io/tutorials/axios/get-query-params
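Alternatively, if you keep the params object on the frontend, here is a sketch of what the backend route would need to look like instead, reading req.query rather than req.params (same Book model as in the question):

// GET /books?genre=PB (what axios produces from params: { genre: 'PB' })
router.get('/books', async function (req, res) {
  const { genre } = req.query;
  try {
    const books = await Book.find({ genre });
    // Book.find resolves to an array, so check length rather than truthiness
    books.length ? res.json(books) : res.status(404).send({ error: 'Not Found' });
  } catch (err) {
    res.status(500).send({ error: err.message });
  }
});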

Netlify lambda function working locally but not in production

I'm trying to use Netlify and its lambda function feature to run a Node function. Based on https://css-tricks.com/using-netlify-forms-and-netlify-functions-to-build-an-email-sign-up-widget/ , I have this in my functions/submission-created.js:
const https = require("https");
exports.handler = async event => {
  const email = JSON.parse(event.body).payload.data.EMAIL
  const asking = JSON.parse(event.body).payload.data.ASKING
  var formData = {
    'email': email,
    'first_name': '',
    'last_name': asking,
    'lists[]': 'NUM'
  };
  var encoded = Object.entries(formData).map(([k, v]) => `${k}=${encodeURIComponent(v)}`).join("&");
  var endpoint = 'https://api.sendfox.com/contacts/?' + encoded;
  const data = JSON.stringify(formData);
  const options = {
    method: 'POST',
    connection: 'keep-alive',
    headers: {
      'Authorization': 'Bearer hhhhh',
      'Content-Type': 'application/json',
    },
    'content-length': data.length,
  };
  console.log(email);
  const req = https.request(endpoint, options, (res) => {
    console.log('statusCode:', res.statusCode);
    console.log('headers:', res.headers);
    res.on('data', (d) => {
      console.log(d);
    });
  });
  req.on('error', (e) => {
    console.error(e);
  });
  req.write(data);
  req.end();
  return {
    statusCode: 200,
    body: data
  };
}
This works as expected when I run it locally with netlify dev, but when pushed to the github repo used by netlify to build the site, it does not work in production. How can I fix this?
The package structure is shown in a screenshot (not reproduced here).
EDIT:
the netlify.toml:
[build]
  functions = "./functions"
No errors. The output on the site's Functions tab is:
8:57:43 PM: 2020-12-08T01:57:43.384Z undefined INFO to here
8:57:43 PM: 2020-12-08T01:57:43.390Z 8ca26edc-1f20-4c20-b038-79ecdf206d92 INFO yt#ghj.org
8:57:43 PM: 2020-12-08T01:57:43.390Z 8ca26edc-1f20-4c20-b038-79ecdf206d92 INFO 999
8:57:43 PM: 2020-12-08T01:57:43.390Z 8ca26edc-1f20-4c20-b038-79ecdf206d92 INFO yt#ghj.org
8:57:43 PM: Duration: 39.71 ms Memory Usage: 69 MB Init Duration: 176.22 ms
As noted, the information you give is insufficient to determine what has gone wrong. However, I will try to answer based on what I have.
In the tutorial, I noticed the usage of the dotenv package.
The package is used to set up different configurations for different environments.
It utilizes environment-specific files, allowing you, for example, to set up a .env file for local development and a different one (e.g., .env.production) for production.
Based on the setup, the variables defined in the .env file used in each environment are loaded into the process.env object.
Now, in your tutorial, I noticed that crucial variables such as EMAIL_TOKEN are loaded from .env. I suspect that your setup expects a separate dotenv file for production, and by not finding it, it silently loads the required parameters as empty. Please check which environment file is loaded and what your configuration is in each environment.
Also, consider the following tutorial for working with dotenv vars.
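For reference, a minimal sketch of that pattern (EMAIL_TOKEN is the variable named in the tutorial; the file contents are illustrative):

// .env (local development) or .env.production (production), kept out of the repo:
// EMAIL_TOKEN=your-sendfox-token

require('dotenv').config(); // copies the file's entries into process.env

const options = {
  method: 'POST',
  headers: {
    // if the file is missing, this silently becomes "Bearer undefined"
    'Authorization': `Bearer ${process.env.EMAIL_TOKEN}`,
    'Content-Type': 'application/json',
  },
};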
This could be because of incorrect settings on the API Gateway. Check whether all traffic is allowed; sometimes only TCP traffic is allowed. Change it to all traffic.
It could also be that you have a malformed lambda response. Check the AWS documentation:
Resolve malformed 502 API Gateway errors
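For comparison, the shape that documentation expects a proxy-integrated handler to return is roughly the following (values are illustrative):

return {
  isBase64Encoded: false,
  statusCode: 200,
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ ok: true }) // body must be a string, not an object
};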

Getting ERR_FS_FILE_TOO_LARGE while using unirest file send with put

I am using unirest to upload a file like so:
unirest.put(fullUri)
  .auth({
    user: self.userName,
    pass: self.password
  })
  .header('X-Checksum-Sha1', sha1Hash)
  .header('X-Checksum-Md5', md5Hash)
  .send(fs.readFileSync(filePath))
  .end(function (response) {
    // ...
  });
This works fine for smaller files, but for large files I get an ERR_FS_FILE_TOO_LARGE error. I have already tried max_old_space_size without success. It looks like I could fix this by streaming the file, but I can't find an API to do that in the unirest js library.
Looks like this is an issue with form-data. From GitHub issues:
It turns out that unirest uses the npm module form-data, and form-data requires that, if it receives a stream that is not an fs.ReadStream, the file information be provided as additional options.
Example:
form.append(
  'my_file',
  stream,
  {
    filename: 'bar.jpg',
    contentType: 'image/jpeg',
    knownLength: 19806,
  },
)
See: https://github.com/form-data/form-data#void-append-string-field-mixed-value--mixed-options-
Streaming files with unirest is available via the .attach method:
unirest.put(fullUri)
  .auth({
    user: self.userName,
    pass: self.password
  })
  .header('X-Checksum-Sha1', sha1Hash)
  .header('X-Checksum-Md5', md5Hash)
  // .send(fs.readFileSync(filePath))
  .attach('filename', fs.createReadStream(filePath))
  .end(function (response) {
    // ...
  });
I can't find an API to do that in unirest js library.
That's because there is none: https://github.com/Kong/unirest-nodejs/issues/49:
You can use the underlying request library to do streaming if you want, I am open to a pull request either on this branch or the 1.0 version to add streaming.
Issue is still open.
But from this issue and from the source code you can find out that end() returns the request (see https://github.com/request/request):
Unirest.request = require('request')
...
end: function (callback) {
  ...
  Request = Unirest.request($this.options, handleRequestResponse)
  Request.on('response', handleGZIPResponse)
  ...
  return Request
}
and from request's source code you can find out that no actual request has been sent yet (it's deferred). So you can hook into it and use its API instead:
const request = unirest.put(constants.server2)
  .auth({
    user: self.userName,
    pass: self.password
  })
  .header('X-Checksum-Sha1', sha1Hash)
  .header('X-Checksum-Md5', md5Hash)
  // .send(fs.readFileSync(filePath))
  .end(...)

fs.createReadStream(filePath).pipe(request) // just pipe it!
As a side note: unirest is based on request, and request is deprecated now. So... maybe you should steer away from unirest.
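For instance, here is a sketch of the same upload using axios, which accepts a Node readable stream as the request body (the option names mirror the unirest calls above; this is an assumption-level sketch, not a drop-in replacement):

const fs = require('fs');
const axios = require('axios');

// Inside an async function: stream the file instead of buffering it with readFileSync
const { data } = await axios.put(fullUri, fs.createReadStream(filePath), {
  auth: { username: self.userName, password: self.password },
  headers: {
    'X-Checksum-Sha1': sha1Hash,
    'X-Checksum-Md5': md5Hash,
  },
});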

What Azure service to use to send daily POST request to Webhook endpoint at specific hour

I need to know which Azure service I can use (I have an Azure account with a lot of credit, so I prefer to use it) to send a POST request to a Slack webhook.
The message is a JSON object with randomized content. If done with JS, for example, this would be the code:
const gifsArr = [...] // array with gifs
const textsArr = [...] // array with texts
const titlesArr = [...] // array with titles

const getRandomFromArr = arr => {
  return arr[Math.floor(Math.random() * arr.length)];
}

const sendPost = () => {
  const endpoint = 'https://hooks.slack.com/...';
  const message = {
    text: getRandomFromArr(textsArr),
    attachments: [{
      title: getRandomFromArr(titlesArr),
      image_url: getRandomFromArr(gifsArr)
    }]
  }
  fetch(endpoint, {
    method: 'POST',
    body: JSON.stringify(message),
    headers: {
      'Content-Type': 'application/json'
    }
  });
}
And I need to send it every day at, say, 10:15 am (this is the part I don't know how to do).
Thanks in advance!
You can use either Azure Logic Apps or Azure Functions with a timer trigger.
From the docs about Azure Logic Apps:
Azure Logic Apps is a cloud service that helps you schedule, automate, and orchestrate tasks, business processes, and workflows when you need to integrate apps, data, systems, and services across enterprises or organizations.
From the docs about Azure Functions:
A timer trigger lets you run a function on a schedule.
As @Sajeetharan said, I ended up using Azure Functions.
I used the VS Code Azure Tools extension to create the project and selected "Timer" as the trigger.
This generates the file function.json, with this code:
{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 15 10 * * *"
    }
  ]
}
That way, the code inside the index.js file will run every day at 10:15. Also, since it's Node, I had to use the https node package instead of fetch (which is part of the browser API).
So, instead of the fetch part, this is what goes in:
const https = require('https');

var host = 'hooks.slack.com';
var path = '/services/..../...';
var options = {
  hostname: host,
  path: path,
  method: "POST",
  headers: {
    "Content-Type": "application/json"
  }
};
var req = https.request(options);
req.write(JSON.stringify(message)); // message is built with getRandomFromArr as above
req.end();
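For context, a minimal sketch of where that snippet lives in the generated index.js (the myTimer binding name comes from function.json above; message and options are built as shown earlier):

const https = require('https');

// Entry point invoked on the schedule defined in function.json
module.exports = async function (context, myTimer) {
  const message = { /* text, attachments built with getRandomFromArr */ };
  var req = https.request(options); // options as defined above
  req.write(JSON.stringify(message));
  req.end();
  context.log('Slack webhook posted');
};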

How to pass Request cookies through node-fetch in isomorphic app?

I'm trying to build an isomorphic project using React, Express and isomorphic fetch (based on whatwg-fetch on the client and node-fetch on the server), starting from this common boilerplate. I'm using a cookie for my access token, and credentials: 'same-origin' on the front-end side to send it to GraphQL, which works pretty well.
The problem is that I can't use the same solution on the server side: node-fetch simply doesn't support cookies out of the box the way XMLHttpRequest does. My fetch request sits a few abstraction layers below the router, so I can't just use the cookie value from req.
Here is my server.js code (full version):
server.get('*', async (req, res, next) => {
  try {
    // some presettings here..
    await Router.dispatch({ path: req.path, query: req.query, context }, (state, component) => {
      data.body = ReactDOM.renderToString(component);
    });
    res.send(template(data));
  } catch (err) {
    next(err);
  }
});
and the route's index.js (full version):
export const action = async (state) => {
  const response = await fetch('/graphql?query={me{id,email}}', {
    credentials: 'same-origin',
  });
  const { data } = await response.json();
  // ...
  return <Login title={title} me={data.me} />;
};
How can I pass my token from server.js to my fetch module? Or maybe there is a better approach?
First off, I hope you have found an answer by now!
Secondly, cookies are really just headers. If you need to send a cookie to authorize server-side requests, you can always just create the string you need for the cookie value and send it as a header.
For an example, take a look at how this server-side node-fetch wrapper appends saved cookies to the outbound request: https://github.com/valeriangalliat/fetch-cookie/blob/master/index.js#L17
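To illustrate the idea (the names below are assumptions, not the boilerplate's actual API), the inbound cookie header can be threaded through the router context and attached to the server-side fetch:

// server.js: stash the raw cookie header on the context passed to Router.dispatch
const context = { cookie: req.headers.cookie };

// route's index.js: forward it on the outbound request instead of credentials
export const action = async (state) => {
  const response = await fetch('http://localhost:3000/graphql?query={me{id,email}}', {
    // node-fetch needs an absolute URL server-side; the host here is an assumption
    headers: state.context.cookie ? { cookie: state.context.cookie } : {},
  });
  const { data } = await response.json();
  return <Login title={title} me={data.me} />;
};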
