How can I use jQuery in the Electron main process?
It seems every example I find is for the renderer process.
For example, I want to create a util, used by the main process, that fetches data from an API via a GET request.
But calling $.get throws an error saying that get is not a function.
Thanks.
jQuery is a JavaScript library for the browser, e.g. for DOM manipulation. You shouldn't use it in the main process, since the main process runs in Node.js.
It's hard to propose a solution without knowing more about your application. If you need the data from the AJAX request in your main process, you can use Node.js's built-in https module. Example from the Twilio blog:
const https = require('https');

https.get('https://api.nasa.gov/planetary/apod?api_key=DEMO_KEY', (resp) => {
  let data = '';

  // A chunk of data has been received.
  resp.on('data', (chunk) => {
    data += chunk;
  });

  // The whole response has been received. Print out the result.
  resp.on('end', () => {
    console.log(JSON.parse(data).explanation);
  });
}).on("error", (err) => {
  console.log("Error: " + err.message);
});
Edit:
As @Hans-Koch mentioned, you probably shouldn't use jQuery in the renderer process either, since one of its main purposes is to normalize the API for DOM manipulation, AJAX, etc., and in Electron you only have to support Chromium. If you want to make AJAX requests you can use XMLHttpRequest or some npm package that wraps it, e.g. xhr.
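For illustration, a minimal sketch of the same kind of GET request in the renderer process with plain XMLHttpRequest (reusing the NASA demo URL from the example above):

const xhr = new XMLHttpRequest();
xhr.open('GET', 'https://api.nasa.gov/planetary/apod?api_key=DEMO_KEY');
xhr.onload = () => {
  // Runs once the full response has arrived
  if (xhr.status === 200) {
    console.log(JSON.parse(xhr.responseText).explanation);
  }
};
xhr.onerror = () => console.log('Request failed');
xhr.send();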
In my SvelteKit app I make AJAX calls to my API endpoints. For example:
+page.svelte
<script>
  import { page } from '$app/stores';

  async function get_card() {
    const url = '/api/card/?category=' + $page.params.slug;
    const response = await fetch(url, {
      method: 'GET',
    });
    const card = await response.json();
    return card;
  }
</script>
In the browser JavaScript console I get this warning:
Loading /api/card/?category=Neurology using `window.fetch`.
For best results, use the `fetch` that is passed to your `load`
function: https://kit.svelte.dev/docs/load#making-fetch-requests
But as far as I can tell, that fetch function is only accessible to me on the server, and I do not see a way to use it in a script that may run on the client (such as +page.svelte). I tried passing the function as part of the data object from load:
+layout.server.js
export const load = async ({ fetch, locals }) => {
  return {
    email: locals.user.email,
    group: locals.user.group,
    fetch: fetch
  };
};
But, not surprisingly, that does not work since the function is not serializable.
Am I Doing It Wrong™, or should I just ignore the warning?
fetch is originally a browser API, and SvelteKit defines it on the server as well if it does not exist there. The warning is there to tell you that you are creating another round trip to the server (one for the page and one for the data) when you possibly could have loaded the data on the server, so it could be transmitted as part of the page (during server-side rendering).
If the code of your function is not executed right away, then this is a false positive (there is a recent issue on this). I.e. if the data should only be requested at a significantly later point, there is no way to bundle the request with the page.
(You are definitely not meant to pass on the fetch of load; you are supposed to use it to get the data.)
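For illustration, a minimal sketch of the intended pattern, assuming a +page.js next to your +page.svelte (the endpoint path is taken from your question):

+page.js

// Use the `fetch` passed to `load`, so SvelteKit can make the request
// during server-side rendering and inline the result into the page.
export const load = async ({ fetch, params }) => {
  const response = await fetch('/api/card/?category=' + params.slug);
  const card = await response.json();
  return { card }; // available in +page.svelte as `data.card`
};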
I'm wondering if there's any way to listen for console messages and act on them when they're received. Mainly, is there any way to do this without an external module, using the http module?
The goal is to trigger a Node.js function or code snippet on an event like a click in the HTML. If there's also a way to do this, that's great. But once again, I'd like to do it without an external module, using only those built into Node.js.
Use an onclick handler in JavaScript to trigger a function call when clicking on an element. Then use fetch to make an API call to the Node.js server.
I know @Haris Wilson already got the answer, but I'd just like to provide a code example.
Instead of trying to catch a console message and then executing a function when we find it, we can use fetch() to make a request to whatever URL we need, and let the server act on that request.
In this case, we can use the url module and the http module to parse the url and serve the API and website, respectively.
const url = require('url')
const http = require('http')

const requestListener = function (req, res) {
  const parsedUrl = url.parse(req.url, true)

  // API
  if (parsedUrl.pathname === '/APIcall') {
    const args = parsedUrl.query
    // Perform necessary actions with `args` here
  }

  // Basic server setup
  res.writeHead(200, {
    'Content-Type': 'text/html'
  });
  res.end(/** Content here */)
}

http.createServer(requestListener).listen(8080)
We can now use onclick to call a function inside our webpage JavaScript, and use fetch([API URL]) to give our Node.js server data to perform an action. We can use URL params to do this, such as https://localhost:8080/APIcall?data=someData&moreParam=more-data, where ?data=someData&moreParam=more-data are the URL params.
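For illustration, the browser side might look like this (a sketch; the button id is made up):

// Assumes a <button id="trigger"> somewhere in the served HTML
document.getElementById('trigger').addEventListener('click', () => {
  // Hits the /APIcall route handled by the server above;
  // the query params are purely illustrative
  fetch('/APIcall?data=someData&moreParam=more-data');
});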
After rendering my index.html (which works fine), I would like to send some additional data via sockets. For that, I would need a promise for the rendering process. At the moment the two operations are not sequenced: the socket data is sent, and moments later it is overwritten because the rendering process finishes afterwards. I'm looking for something like:
res.render('index', {title: "XYZ"})
  .then(function () {
    // ...do something
  });
Is there a different approach? Or is the only solution to ask for the data via the client?
Thanks for any help!
Does the render function return a promise?
The documentation doesn't mention one, so presumably not.
For that, I would need a promise for the rendering process.
Not necessarily, just some kind of notification that the data had been sent. Promises are one kind of notification, but not the only kind.
The documentation shows that render will call a callback function with the rendered HTML, so you could use that callback to send the HTML along with whatever you want to have follow it:
res.render("index", {title: "XYZ"}, function (err, html) {
if (err) {
// ...send an error response...
return;
}
res.send(html);
// ...send your other stuff here...
});
But if you want a promise, you could use util.promisify on res.render. It's a bit of a pain because promisify doesn't deal with the method's this binding for you, so you have to use bind:
const util = require("util");

const resRender = util.promisify(res.render.bind(res));
// ...
resRender("index", {title: "XYZ"})
  .then(html => {
    res.send(html);
    // ...send your other stuff here...
  })
  .catch(err => {
    // ...send an error response...
  });
You've said you're sending further information "via sockets." That makes it sound to me like the further information you're sending isn't being sent via the res response, but via a separate channel.
If so, and if you want to wait to send that until the response is sent, you can start your socket sending in response to the finish event on the response:
res.on("finish", () => {
// Send your socket message here
});
res.render("index", {title: "XYZ"});
(Remember that an Express Response object is an enhanced version of the Node.js ServerResponse object, which is what provides this event.)
But even then, all that means is that the data has been handed over to the OS for transmission to the client. From the documentation:
...this event is emitted when the last segment of the response headers and body have been handed off to the operating system for transmission over the network. It does not imply that the client has received anything yet.
I don't think you have anything beyond that to hook into.
BACK STORY:
Let me start with my problem: I need to update a Firebase database from an Arduino, so I used the firebase-arduino library, but for some reason it will not compile for the NodeMCU. So my next approach is a bit convoluted: I created a JavaScript page that updates Firebase. I just need to add 1 to a value in the database, so I don't need to update a sensor value or anything; loading the webpage updates the value. I thought it could be triggered by an HTTP request from the Arduino, but I was wrong; it does not work like that.
QUESTION: How can I run the JavaScript in a webpage via a web request from the Arduino?
Assuming you have Node.js installed, you can have something like this (source):
const https = require('https');

https.get('your_url_here', (resp) => {
  let data = '';

  // A chunk of data has been received.
  resp.on('data', (chunk) => {
    data += chunk;
  });

  // The whole response has been received. Print out the result.
  resp.on('end', () => {
    console.log(JSON.parse(data).explanation);
  });
}).on("error", (err) => {
  console.log("Error: " + err.message);
});
But if you don't have Node.js installed, you can make the HTTP request with shell commands like curl. This can be useful since you can run it as a daemon (running in the background every X minutes).
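If you'd rather stay in Node.js, you can get the same daemon-like effect with a timer around the request above (a sketch; the five-minute interval is arbitrary):

setInterval(() => {
  https.get('your_url_here', (resp) => {
    resp.resume(); // drain the response; we only care about triggering the page
  });
}, 5 * 60 * 1000);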
Let me know if you managed something, and good luck.
My use case is as follows:
I make plenty of REST API calls from my Node server to public APIs. Sometimes the response is big and sometimes it's small. My use case demands that I stringify the response JSON. I know a big JSON response is going to block my event loop. After some research I decided to use child_process.fork for parsing these responses, so that the other API calls need not wait. I tried sending a big 30 MB JSON file from my main process to the forked child process. It takes very long for the child process to pick up and parse the JSON. The response I'm expecting from the child process is not huge; I just want to stringify it, get the length, and send that back to the main process.
I'm attaching the master and child code.
var moment = require('moment');
var child_process = require('child_process');
var request = require('request');

var start_time = moment.utc().valueOf();

request({url: 'http://localhost:9009/bigjson'}, function (err, resp, body) {
  if (!err && resp.statusCode == 200) {
    console.log('Body Length : ' + body.length);

    // Note: don't name this variable `process`, or it will shadow
    // the global `process` object.
    var ls = child_process.fork("response_handler.js", ["0"]);

    ls.on('message', function (message) {
      console.log(moment.utc().valueOf() - start_time);
      console.log(message);
    });
    ls.on('close', function (code) {
      console.log('child process exited with code ' + code);
    });
    ls.on('error', function (err) {
      console.log('Error : ' + err);
    });
    ls.on('exit', function (code, signal) {
      console.log('Exit : code : ' + code + ' signal : ' + signal);
    });

    ls.send({content: body});
  }
});
response_handler.js
console.log("Process " + process.argv[2] + " at work ");
process.on('message', function (json) {
console.log('Before Parsing');
var x = JSON.stringify(json);
console.log('After Parsing');
process.send({msg: 'Sending message from the child. total size is' + x.length});
});
Is there a better way to achieve what I'm trying to do? On the one hand I need the power of Node.js to make thousands of API calls per second, but sometimes I get a big JSON back, which screws things up.
Your task is both IO-bound (fetching the 30 MB JSON), where Node's asynchronicity shines, and CPU-bound (parsing the 30 MB JSON), where asynchronicity doesn't help you.
Forking too many processes soon becomes a resource hog and degrades performance. For CPU-bound tasks you need just as many processes as you have cores and no more.
I would use one separate process to do the fetching and delegate parsing to N other processes, where N is (at most) the number of your CPU cores minus 1 and use some form of IPC for the process communication.
One choice is to use Node's Cluster module to orchestrate all of the above: https://nodejs.org/docs/latest/api/cluster.html
Using this module, you can have a master process create your worker processes upfront, and you don't need to worry about when to fork, how many processes to create, etc. IPC works as usual with process.send and process.on. A possible workflow is:
Application startup: master process creates a "fetcher" and N "parser" processes.
The fetcher is sent a work list of API endpoints to process, and starts fetching JSON, sending it back to the master process.
On every JSON fetched, the master sends it to a parser process. You could use them in a round-robin fashion, or use a more sophisticated way of signalling to the master process when a parser's work queue is empty or running low.
The parser processes send the resulting JSON object back to the master.
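A minimal sketch of that workflow with the cluster module, assuming a single script where each worker picks its role from an environment variable (the ROLE variable and the message shapes are made up for illustration):

const cluster = require('cluster');
const os = require('os');

if (cluster.isMaster) {
  // One fetcher, and (cores - 1) parsers
  const fetcher = cluster.fork({ ROLE: 'fetcher' });
  const parsers = [];
  for (let i = 0; i < Math.max(1, os.cpus().length - 1); i++) {
    parsers.push(cluster.fork({ ROLE: 'parser' }));
  }

  let next = 0;
  fetcher.on('message', (msg) => {
    // Hand each fetched body to a parser, round-robin
    parsers[next].send(msg);
    next = (next + 1) % parsers.length;
  });

  parsers.forEach((p) => p.on('message', (result) => {
    console.log('parser result:', result);
  }));
} else if (process.env.ROLE === 'parser') {
  process.on('message', (msg) => {
    const parsed = JSON.parse(msg.body); // the CPU-bound part
    // Reply with something small, e.g. a count, not the parsed object
    process.send({ topLevelKeys: Object.keys(parsed).length });
  });
} else {
  // Fetcher: fetch the JSON here (e.g. with the request code from the
  // question) and forward each body to the master:
  // process.send({ body: jsonString });
}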
Note that IPC also has non-trivial overhead, especially when sending/receiving large objects. You could even have the fetcher parse very small responses itself instead of passing them around, to avoid this. "Small" here is probably < 32KB.
See also: Is it expensive/efficient to send data between processes in Node?