JavaScript fetch is delayed

I have an Express server that should run a shell script when my website triggers an event. The problem is that the shell script only runs after the "confirm window" has been accepted or denied. I want the fetch to go out as soon as possible. I don't even need anything back from the Express server; I just want to signal it to run the shell script immediately.
I have this code on the website:
messaging.onMessage(function (payload) {
  fetch("http://localhost:9000/testAPI")
    .then(res => res.text())
    .then(res => console.log("something:" + res));
  var r = confirm(callingname + " is calling.");
  if (r == true) {
    window.open(payload.data.contact_link, "_self");
  } else {
    console.log("didn't open");
  }
});
I have this code on the backend:
var express = require("express");
var router = express.Router();

router.get("/", function (req, res, next) {
  const { exec } = require('child_process');
  exec('bash hi.sh', (error, stdout, stderr) => {
    console.log(stdout);
    console.log(stderr);
    if (error !== null) {
      console.log(`exec error: ${error}`);
    }
  });
  res.send("API is working");
});

module.exports = router;

confirm() is blocking, and you only have a single thread. This means confirm() will stop the world for your application, preventing fetch() from doing anything.
As the simplest possible fix, you can try delaying the moment when confirm() is invoked. This would allow fetch() to get the request out.
messaging.onMessage(function (payload) {
  fetch("http://localhost:9000/testAPI")
    .then(res => res.text())
    .then(text => console.log("something:" + text));
  // Defer the blocking dialog so the request can leave first.
  setTimeout(function () {
    if (confirm(`${callingname} is calling.`)) {
      window.open(payload.data.contact_link, "_self");
    } else {
      console.log("didn't open");
    }
  }, 50);
});
Other options would be to put confirm() into one of the .then() callbacks of fetch, or to use a non-blocking alternative to confirm(), as suggested in the comments.
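For illustration, here is a minimal sketch of the first option, with confirm() moved into the fetch chain; it assumes the same messaging, callingname and payload objects as the code above:

messaging.onMessage(function (payload) {
  fetch("http://localhost:9000/testAPI")
    .then(res => res.text())
    .then(text => {
      console.log("something:" + text);
      // The response has arrived, so blocking the thread is harmless now.
      if (confirm(`${callingname} is calling.`)) {
        window.open(payload.data.contact_link, "_self");
      } else {
        console.log("didn't open");
      }
    });
});

Note that the dialog now only appears after the server has responded, which may or may not be acceptable for your use case.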

Related

res.redirect not working when working with React

I am using React as the frontend and made an API using Express. I store a JWT token in the cookies when logging in for the first time. When the user tries to log in again, I check whether there is already a token in the cookies (currently I am not verifying it, I just want it to work); if there is, I redirect the user to the profile page, but it doesn't work.
An XMLHttpRequest can be seen in the network tab, but the redirect doesn't happen.
PS: I am using Axios in the frontend to make a GET request.
loginRouter.get("/", async (req, res) => {
  try {
    const cookieFound = req.cookies["login-token"];
    if (cookieFound) {
      res.redirect("profile");
    } else {
      res.redirect("login");
    }
  } catch (error) {
    console.log(error);
    res.status(500).json("Ooops something went wrong!");
  }
});
Code to make a GET request in the frontend:
useEffect(() => {
  Axios.get("/login");
}, []);
EDIT:
Backend
loginRouter.get("/", async (req, res) => {
  try {
    const cookieFound = req.cookies["login-token"];
    if (cookieFound) {
      res.send("/profile");
    }
    // res.status(200).json(cookieFound);
  } catch (error) {
    console.log(error);
    res.status(500).json("Ooops something went wrong!");
  }
});
Client
useEffect(() => {
  const alreadyLoggedIn = async () => {
    const url = await Axios.get("/login");
    window.location = url.data;
  };
  alreadyLoggedIn();
}, []);
Given your edit, I think you should change window.location = url.data to window.location = window.location.origin + url.data;
In your current setup the URL ends up as just /profile, while you want the full address, e.g. yourwebsite.com/profile. Note that window.location.origin (rather than hostname) includes the protocol as well as the host, so the browser does not treat the result as a relative path.
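As a minimal sketch (assuming the server still responds with a path such as /profile), the adjusted client would be:

useEffect(() => {
  const alreadyLoggedIn = async () => {
    const res = await Axios.get("/login");
    // origin includes protocol + host, e.g. "https://yourwebsite.com",
    // so the browser does not treat the result as a relative path.
    window.location = window.location.origin + res.data;
  };
  alreadyLoggedIn();
}, []);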

Timeout acquiring a connection when streaming results using Express

We use the following code to stream the results of a query back to the client:
app.get('/events', (req, res, next) => {
  try {
    const stream = db('events')
      .select('*')
      .where({ id_user: 'foo' })
      .stream()
    stream.pipe(JSONStream.stringify()).pipe(res)
  } catch (err) {
    next(err)
  }
})
While the code seems to have an excellent memory usage profile (stable/low memory usage) it creates random DB connection acquisition timeouts:
Knex: Timeout acquiring a connection. The pool is probably full. Are
you missing a .transacting(trx) call?
This happens in production at seemingly random intervals. Any idea why?
This happens because aborted requests (i.e. the client closes the browser mid-request) don't release the connection back to the pool.
First, ensure you're on the latest knex, or at least v0.21.3+, which introduced fixes to stream/pool handling.
From then on you have a couple of options:
Either use stream.pipeline instead of stream.pipe, which handles aborted requests correctly, like so:
const { pipeline } = require('stream')

app.get('/events', (req, res, next) => {
  try {
    const stream = db('events')
      .select('*')
      .where({ id_session: req.query.id_session })
      .stream()
    return pipeline(stream, JSONStream.stringify(), res, err => {
      if (err) {
        return console.log(`Pipeline failed with err:`, err)
      }
      console.log(`Pipeline ended successfully`)
    })
  } catch (err) {
    next(err)
  }
})
or listen to the close event on req and destroy the DB stream yourself, like so:
app.get('/events', (req, res, next) => {
  try {
    const stream = db('events')
      .select('*')
      .where({ id_session: req.query.id_session })
      .stream()
    // Not listening to this event will crash the process if
    // stream.destroy(err) is called.
    stream.on('error', () => {
      console.log('Stream was destroyed')
    })
    req.on('close', () => {
      // stream.end() does not seem to work, only destroy()
      stream.destroy(new Error('Aborted request'))
    })
    stream.pipe(JSONStream.stringify()).pipe(res)
  } catch (err) {
    next(err)
  }
})
Useful reading:
knex Wiki: Manually close streams. Careful, the stream.end mentioned here doesn't seem to work.
knex Issue: stream.end() does not return connection to pool

Firebase Functions: Random 404s

I'm using Firebase Functions on a server for API calls. Everything works fine 70% of the time, but all of a sudden some of my function calls start failing with a 404 and don't work for the next few hours.
In StackDriver I can see the function isn't invoked when I retry; my API just gives me a 404 without the request ever reaching the server.
Below is one of the calls that fails once in a while. When I go to the URL I'm fetching, the GET result always shows up, so I have no clue what the issue is.
API call:
const getCreators = () => {
  return window
    .fetch(url + '/get-creators', {
      method: 'GET',
      headers: {
        'Content-Type': 'application/json',
      },
    })
    .then((res) => {
      console.log(res);
      if (res.status === 200) {
        return res.json();
      } else {
        return null;
      }
    })
    .then((data) => {
      if (!data || data.error) {
        return null;
      } else {
        return data;
      }
    });
};
Server code:
const app = express();

app.get('/get-creators', async (req, res) => {
  console.log('creators: ');
  creators
    .find()
    .toArray()
    .then((result) => {
      console.log(result);
      res.status(200).send(result);
    })
    .catch(() => {
      console.log('error');
      res.send('error');
    });
});

app.listen(4242, () => console.log(`Node server listening at https ${4242}!`));

exports.app = functions.https.onRequest(app);
Found it. You don't want the below code on your server:
app.listen(4242, () => console.log(`Node server listening at https ${4242}!`));
I commented this code out, republished, and all is well.
I thought having it didn't make a difference, but apparently once in a blue moon it will try to make the server listen locally, which gave me a 404. With Cloud Functions, the platform provides the HTTP server, so the app must not call listen() itself.
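For reference, a minimal sketch of the corrected server file (the creators collection is assumed to be initialized elsewhere, as in the question):

const functions = require('firebase-functions');
const express = require('express');

const app = express();

app.get('/get-creators', (req, res) => {
  creators // assumed: a MongoDB collection set up elsewhere
    .find()
    .toArray()
    .then((result) => res.status(200).send(result))
    .catch(() => res.status(500).send('error'));
});

// No app.listen() here: Cloud Functions provides the HTTP server
// and invokes the exported handler.
exports.app = functions.https.onRequest(app);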

Electron interceptBufferProtocol proxying requests does not work

Using interceptBufferProtocol, I can successfully intercept the loadURL call to https://google.com (mainWindow.loadURL("https://google.com/")) and replace the page with my custom HTML code. The HTML code has an iframe which I am trying to proxy. This can usually be achieved by setting the Electron BrowserWindow proxy, but in my case it fails to work. I set the proxy with the following code:
mainWindow.webContents.session.setProxy({
  proxyRules: "http://" + proxy
}, () => {
  console.log('Proxy: http://' + proxy)
})
Intercept URL code:
ses.protocol.interceptBufferProtocol('https', (req, callback) => {
  ses.resolveProxy(req.url, (x) => {
    console.log(x)
  })
  if (req.url == "https://google.com/") {
    fs.readFile(path.join(__dirname, "/../../path/stuff.html"), 'utf8', function (err, html) {
      callback(Buffer.from(html, 'utf8'));
    });
  } else {
    const request = net.request(req)
    request.on('response', res => {
      const chunks = []
      res.on('data', chunk => {
        chunks.push(Buffer.from(chunk))
      })
      res.on('end', async () => {
        const file = Buffer.concat(chunks)
        callback(file)
      })
    })
    if (req.uploadData) {
      req.uploadData.forEach(part => {
        if (part.bytes) {
          request.write(part.bytes)
        } else if (part.file) {
          request.write(fs.readFileSync(part.file))
        }
      })
    }
    request.end()
  }
})
However, no matter what I do, it appears to use my local IP instead of a proxy. Do I have any options?
The code runs fine without a proxy. I'm trying to run it with one. The problem lies within the .interceptBufferProtocol() function. Any help would be appreciated!

JavaScript single request to two different ports

I'm testing a script where one function makes an HTTP request and is then called in another function.
The first function is:
export function getFeedData (sub) {
  if (getFeedId(sub) === 2) {
    return axios.get('http://localhost:4000').then((data) => JSON.parse(data));
  }
}
And the second is:
export function isDelay (sub, stop) {
  return getFeedData(sub).then((data) => {
    return data.entity.filter((entityObj) => {
      return entityObj.stop_time_update !== undefined;
    });
  }).then((newData) => {
    console.log(newData);
  }).catch((err) => {
    console.log(err);
  });
}
The reason they're two different functions is that the second will eventually be longer, and I wanted to separate everything out for the sake of simplicity and making my code a bit more declarative.
The tests for these functions currently look like this:
import express from 'express';
import { getFeedId, getFeedData, reverseStop, isDelay } from '../mocks/apiMock';

const app = express();
app.use(express.static('../mocks/MockData.json'));

it('returns json data', (done) => {
  app.listen(4000, function () {
    expect.assertions(2);
    return getFeedData('L').then((data) => {
      expect(data).toBeDefined();
      expect(data.header.gtfs_realtime_version).toBe('1.0');
    });
  });
  done();
});

it('returns either the delay or time until the next train', (done) => {
  app.listen(4000, function () {
    isDelay('L', 'Lorimer St');
  });
  done();
});
That second test doesn't run because it's trying to listen on a port that's already occupied.
The solution I had in mind was to pass 0 as the first parameter to app.listen() so it listens on a random port. However, I don't know how my axios request could target that specific port. Is there a way to do this? Or perhaps a better solution to my problem? Please be kind, as this is my first real independent dive into creating Node/Express servers, and I'm trying my best to research problems on my own before posting here.
You should use only one specific HTTP server port, e.g. :4000.
Then set up your HTTP server like this:
const port = 4000;
let server;

beforeAll((done) => {
  server = app.listen(port, () => {
    done();
  });
});

afterAll((done) => {
  server && server.close(done);
});

// Your test suites and test cases.
Then you can test your functions, and they will all make HTTP requests to the same endpoint, e.g. http://localhost:4000.
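Put together, the test file could look roughly like this (a sketch assuming Jest as the runner and the same apiMock import as in the question):

import express from 'express';
import { getFeedData } from '../mocks/apiMock';

const app = express();
// ...configure the app to serve the mock data, as in the question.

const port = 4000;
let server;

beforeAll((done) => {
  // Start one shared server for the whole file.
  server = app.listen(port, done);
});

afterAll((done) => {
  // Release the port so other test files can reuse it.
  server && server.close(done);
});

it('returns json data', () => {
  expect.assertions(2);
  // Returning the promise lets Jest wait for the assertions.
  return getFeedData('L').then((data) => {
    expect(data).toBeDefined();
    expect(data.header.gtfs_realtime_version).toBe('1.0');
  });
});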
