setInterval working termination undefined - javascript

Take a look at my code. My setInterval sends no more than 10 requests and then stops working, even though every request gets a response from the server. Why does this happen, and how can I fix it? I tried axios and had the same problem.
setInterval(function() {
  console.log("I'm here")
  fetch(`/api/....`, {
    method: 'POST',
    headers: {
      'Accept': 'application/json, text/plain, */*',
      'Content-Type': 'application/json'
    },
    referrerPolicy: 'no-referrer'
  }).then((res) => {
    console.log("I'm here")
    console.log(res)
  }).catch((err) => {
    console.log(err)
  })
}, 1000);
Sometimes an error like this also flies by: TypeError: NetworkError when attempting to fetch resource.

Provided your backend co-operates, there is no reason why your JavaScript code should not work:
var i = 0,
  iv = setInterval(function() {
    console.log(`I'm here: ${++i}`);
    fetch(`https://jsonplaceholder.typicode.com/users`, {
      method: 'POST',
      headers: {'Content-Type': 'application/json'},
      body: JSON.stringify({newId: i})
    }).then(r => r.json()).then(res => {
      console.log(`I'm back: ${res.newId}`);
      console.log(res);
      if (i == 25) clearInterval(iv);
    }).catch((err) => {console.log(err)});
  }, 1000);
(There were a few typos in your console.logs which I straightened out.)

It's hard to say without the server-side code, but I can guess there is some limitation in your Express application that limits the number of your requests to 10.
It has nothing to do with your JavaScript code, which should fire more than 10 requests without any problem.
As you can see in this article on Rate Limiting:
const express = require("express");
const indexRoute = require("./router");
const rateLimit = require("express-rate-limit");
const app = express();
const port = 3000;
app.use(
  rateLimit({
    windowMs: 12 * 60 * 60 * 1000, // 12 hour duration in milliseconds
    max: 5,
    message: "You exceeded 5 requests in 12 hour limit!",
    headers: true,
  })
);
app.use("/", indexRoute);
app.listen(port);
You can limit the available requests by changing the max value.
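If rate limiting is indeed the cause, you can confirm it from the client without seeing the server code: express-rate-limit responds with HTTP 429 (Too Many Requests) once the quota is exhausted. A minimal sketch, reusing the placeholder /api/.... endpoint from the question:
setInterval(function() {
  fetch(`/api/....`, {
    method: 'POST',
    headers: {'Content-Type': 'application/json'}
  }).then((res) => {
    // 429 means the server's rate limiter rejected the request
    if (res.status === 429) {
      console.log('Rate limited by the server');
    } else {
      console.log('OK:', res.status);
    }
  }).catch((err) => console.log(err));
}, 1000);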

Related

CloudFlare Direct creator video upload, TUS protocol, error after 100%

I am trying to upload a video using the TUS methodology (tus-js-client), and everything seems to be working well until the very end. I will post my code below; please see if you can help me!
/** Get one time link to upload video to CloudFlare */
router.post('/oneTimeLink', async (req, res) => {
  var config = {
    method: 'POST',
    url: `https://api.cloudflare.com/client/v4/accounts/${process.env.CLOUDFLARE_CLIENT_ID}/stream?direct_user=true`,
    headers: {
      'Authorization': `Bearer ${process.env.CLOUDFLARE_KEY_ID}`,
      'Tus-Resumable': '1.0.0',
      'Upload-Length': '1',
      'Upload-Metadata': 'maxdurationseconds NjAw',
    },
  };
  axios(config)
    .then(async function (response) {
      const location = await response.headers.location
      console.log(location)
      res.set({
        'Access-Control-Allow-Headers': '*',
        'Access-Control-Expose-Headers': '*',
        'Access-Control-Allow-Origin': '*',
        'Access-Control-Allow-Methods': '*',
        'Location': location,
      })
      res.send(true)
    })
    .catch(function (error) {
      console.log(error);
    });
})
In the Upload-Metadata I set maxdurationseconds NjAw, which means a maxDurationSeconds of 10 min (I am sending a 25-second video).
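(For reference, TUS metadata values are base64-encoded strings, and NjAw decodes to exactly 600:)
// In Node.js: check the TUS metadata value
Buffer.from('600').toString('base64')    // 'NjAw'
Buffer.from('NjAw', 'base64').toString() // '600'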
That’s my node.js backend API.
Now, on the frontend side, I am using tus-js-client as follows.
import:
import * as tus from "tus-js-client";
Function:
const file = state.coursesData['Module 1'][0].content.video
var upload = new tus.Upload(file, {
  endpoint: 'http://localhost:8080/dev/coursesLibraryPractitioner/oneTimeLink',
  chunkSize: 250 * 1024 * 1024,
  metadata: {
    filename: file.name,
    filetype: file.type
  },
  onError: function (error) {
    console.log("Failed because: " + error)
  },
  onProgress: function (bytesUploaded, bytesTotal) {
    var percentage = (bytesUploaded / bytesTotal * 100).toFixed(2)
    console.log(bytesUploaded, bytesTotal, percentage + "%")
  },
  onSuccess: function () {
    console.log("Download %s from %s", upload.file.name, upload.url)
  }
})
upload.start()
The endpoint points to my backend server, which returns the URL in the Location header.
Now, this is what happens when I trigger this function.
The backend route executes without error, and the Cloudflare dashboard creates a new empty video, which is correct according to the docs.
In the frontend logs:
Regardless of the chunk size, the percentage sometimes reaches 100%, but the video is NEVER saved. I don't know where that PATCH request is coming from.
I need to get this process working for the company I work for; they will acquire a more robust subscription than mine, which is currently set at 1000 minutes per month for testing.
Thanks in advance, I appreciate your help!!
The last chunk at the end of your upload should be a multiple of 256 KiB.
Try changing 250 * 1024 * 1024 to 256 * 1024 * 1024.
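In other words, only the chunkSize line in the tus.Upload options needs to change; a sketch of just that option, everything else as in the question:
var upload = new tus.Upload(file, {
  endpoint: 'http://localhost:8080/dev/coursesLibraryPractitioner/oneTimeLink',
  chunkSize: 256 * 1024 * 1024, // 256 MiB, a clean multiple of 256 KiB
  // ...metadata and callbacks unchanged from the question
})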

NodeJs / Axios: timeout issues linked to high level of concurrency

I've set up a nodeJS server which basically receives an ad request, sends ad requests through Axios to multiple endpoints with a timeout (usually 1000 ms), and then parses and sends back all results.
On each pod, around 600 requests per minute are performed to various external endpoints.
When we start the pod, requests run pretty well at the beginning; then, after 1 or 2 minutes, all requests return as timed out...
I use Promise.all to manage concurrent requests; you'll find below the adapterRequest component in charge of sending requests.
I also tried to send some data within my cluster, and I also got a timeout, which makes me confident that the issue is linked to Axios.
Getting into details, I create promise arrays with the following module, then I use Promise.all to fetch the data:
const axios = require('axios');
const fakeServer = require('../test/fake-server');
let promiseCount = 0;

module.exports = (ssp, payload, endpoint, method, timeout) => new Promise((resolve) => {
  const cmd = process.env.NODE_ENV === 'test' ? fakeServer : axios;
  const start = Date.now();
  const config = {
    ssp,
    url: endpoint,
    method,
    timeout,
    header: {
      'Content-Type': 'application/json; charset=utf-8',
      Accept: 'application/json',
      Connexion: 'keep-alive',
    },
    data: payload,
  };
  cmd(config)
    .then((response) => {
      promiseCount += 1;
      console.log(ssp, 'RESPONSEOK', promiseCount);
      resolve({
        ssp,
        uri: config.url,
        requestbody: payload,
        requestheaders: config.header,
        responsebody: response.data,
        status: response.status,
        responsetimemillis: Date.now() - start,
      });
    })
    .catch((error) => {
      promiseCount += 1;
      console.log(ssp, error.code, promiseCount);
      let responsebody;
      let status;
      if (error.response === undefined) {
        responsebody = undefined;
        status = undefined;
      } else {
        responsebody = error.response.data ? error.response.data : error.message;
        status = error.response.status;
      }
      resolve({
        ssp,
        uri: config.url,
        requestbody: payload,
        requestheaders: config.header,
        responsebody,
        status,
        responsetimemillis: Date.now() - start,
      });
    });
});
Should I try to use such modules as agentkeepalive, with one Axios instance per endpoint?
Thanks!
Just for debugging, before you try another module:
inside the axios config, could you change header into headers and add keep-alive agents, like this:
const http = require('http')
const https = require('https')

const config = {
  ssp,
  url: endpoint,
  method,
  timeout,
  headers: {
    'Content-Type': 'application/json; charset=utf-8',
    Accept: 'application/json',
  },
  data: payload,
  // keepAlive pools and reuses TCP connections, so it's faster
  httpAgent: new http.Agent({ keepAlive: true }),
  httpsAgent: new https.Agent({ keepAlive: true }),
}
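As for one Axios instance per endpoint: a single shared instance is usually enough, because the keep-alive agents pool sockets per host on their own. A minimal sketch of such a shared client, assuming the 1000 ms timeout mentioned in the question:
const http = require('http');
const https = require('https');
const axios = require('axios');

// One shared client; the agents keep sockets open and reuse them per host.
module.exports = axios.create({
  timeout: 1000,
  httpAgent: new http.Agent({ keepAlive: true }),
  httpsAgent: new https.Agent({ keepAlive: true }),
});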

How to Fix Coinbase Pro API Request Headers?

I am trying to write code to execute orders using the Coinbase Pro API, according to the documentation provided. However, I get an error like this:
Access to XMLHttpRequest at 'https://api.pro.coinbase.com/orders' from origin 'http://localhost:8000' has been blocked by CORS policy: Request header field cb-access-key is not allowed by Access-Control-Allow-Headers in preflight response.
And this is the code that I wrote.
var vm = this;
var coinbasePro = {
  passphrase: 'xxxxxxxxx',
  key: 'xxxxxxxxxxxxxxxxxxxxxxx',
  secret: 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx==',
  apiURI: 'https://api.pro.coinbase.com',
};
var dataRequest = {
  url: '/orders',
  method: 'POST',
  timestamp: Date.now() / 1000,
};
var dataBody = JSON.stringify({
  price: '1.0',
  size: '1.0',
  side: 'buy',
  product_id: 'BTC-USD'
});
var what = dataRequest.timestamp + dataRequest.method + dataRequest.url + dataBody;
var key = Buffer.from(coinbasePro.secret, 'base64');
var hmac = cryptoJs.createHmac('sha256', key);
var sign = hmac.update(what).digest('base64');
vm.$http({
  url: coinbasePro.apiURI + dataRequest.url,
  method: dataRequest.method,
  headers: {
    'Accept': 'application/json',
    'CB-ACCESS-KEY': coinbasePro.key,
    'CB-ACCESS-SIGN': sign,
    'CB-ACCESS-PASSPHRASE': coinbasePro.passphrase,
    'CB-ACCESS-TIMESTAMP': dataRequest.timestamp,
  },
}).then((res) => {
  console.log(res);
}).catch((err) => {
  console.log(err);
});
I have tried different ways to get things going and applied some of the references I have come across. Thank you for the help.
Their API does support CORS; however, it is misconfigured and does not permit the security headers that they require you to use! You can work around this by running an Express proxy with middleware to re-write the headers:
import express from 'express'
import { createProxyMiddleware } from 'http-proxy-middleware'

const app = express()
app.use(express.static('client'))

const apiProxy = createProxyMiddleware({
  target: 'https://api.pro.coinbase.com',
  changeOrigin: true,
  onProxyRes: res => {
    res.headers = {
      ...res.headers,
      'access-control-allow-headers':
        'Content-Type, cb-access-key, cb-access-sign, cb-access-timestamp, cb-access-passphrase',
    }
  },
})

app.use('/', apiProxy)
app.listen(3001)
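The browser then talks to the proxy instead of the API host directly; hypothetically, with the names from the question's snippet:
vm.$http({
  // the proxy on port 3001 forwards this to https://api.pro.coinbase.com/orders
  url: 'http://localhost:3001' + dataRequest.url,
  method: dataRequest.method,
  headers: {
    'Accept': 'application/json',
    'CB-ACCESS-KEY': coinbasePro.key,
    'CB-ACCESS-SIGN': sign,
    'CB-ACCESS-PASSPHRASE': coinbasePro.passphrase,
    'CB-ACCESS-TIMESTAMP': dataRequest.timestamp,
  },
})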

npm start gets 401 and node app.js gets 200 response

I have a React project that I run with npm start, and this code gets a 401 error from the second fetch (the first one is OK). It runs fine, returning 200, when run with Node alone, as in "node App.js".
So what would I need to do to get a 200 response when running my React project? Why is there this difference between npm and node for this request?
const clientID = <ClientID>
const clientSecret = <ClientSecret>
const encode = Buffer.from(`${clientID}:${clientSecret}`, 'utf8').toString('base64')

const requestOptions = {
  method: 'POST',
  headers: {
    'Content-Type': 'application/x-www-form-urlencoded',
    'Authorization': `Basic ${encode}`,
  },
};

fetch("https://auth-nato.auth.us-east-1.amazoncognito.com/oauth2/token?grant_type=client_credentials", requestOptions)
  .then(response => { return response.json() })
  .then(data => {
    const requestOptions2 = {
      method: 'POST',
      mode: 'no-cors',
      headers: {
        'Content-Type': 'application/json',
        'Authorization': `Bearer ${data.access_token}`
      },
      body: '{"username":"Ana", "password":"test123","user_id":"ana#email.com"}'
    };
    fetch('https://j1r07lanr6.execute-api.sa-east-1.amazonaws.com/v1/register', requestOptions2)
      .then(response => { console.log(response) });
  })
Buffer is not present in the browser's JavaScript.
Instead of
const encode = Buffer.from(`${clientID}:${clientSecret}`, 'utf8').toString('base64')
use just
const encode = btoa(`${clientID}:${clientSecret}`);
Read more about base64 encoding on MDN.
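For plain ASCII client IDs and secrets the two produce identical output; a quick sanity check with a hypothetical id:secret pair:
// Node.js (what runs under node App.js):
Buffer.from('id:secret', 'utf8').toString('base64') // 'aWQ6c2VjcmV0'
// Browser (what runs under npm start):
btoa('id:secret')                                   // 'aWQ6c2VjcmV0'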
I found out it was a CORS issue that needed to be set correctly on the back-end. I had tried adding "Access-Control-Allow-Origin": "http://localhost:3000" to the headers, but it didn't work. My workaround was disabling Chrome web security and removing mode: 'no-cors'.

Unable to get browser to store session-cookie using fetch in JavaScript

So, I've made a Node.js API using the Express framework. The API supports a POST request to /login, where the client should include email and password formatted as JSON in the body. The API will then return a session cookie via the Set-Cookie header.
I DO see the cookie coming back from the API as a response cookie; however, the browser isn't storing it, and therefore it is not sent with further requests from the client. I've tried using {credentials: 'include'} since this is a CORS request. I've also added the cors module in my Node server (API) to handle the OPTIONS (pre-flight) requests. I've spent so many hours trying to figure this out, so any help would be much appreciated.
Side note: this works completely fine in both Postman and a prototype iOS app I've developed using the same API, so there shouldn't be any issues on the server itself.
I've included relevant code from the server and the front-end below.
Code from server:
app.use(cors({credentials: true, origin: ['http://expivider.dk', 'http://expivider.herokuapp.com', 'https://expivider.herokuapp.com', 'http://api.expivider.dk']}));

app.use(session({
  cookieName: 'session',
  secret: SECRET_HERE,
  duration: 30 * 60 * 1000,
  activeDuration: 5 * 60 * 1000,
  cookie: {
    // path: '/api', cookie will only be sent to requests under '/api'
    maxAge: 60000,    // duration of the cookie in milliseconds, defaults to duration above
    ephemeral: false, // when true, cookie expires when the browser closes
    httpOnly: false,  // when true, cookie is not accessible from javascript
    secure: false,    // when true, cookie will only be sent over SSL. use key 'secureProxy' instead if you handle SSL not in your node process
    path: "/"
    //domain: "expivider.herokuapp.com"
  }
}));
Code from front-end:
const handleRequestWithBodyWithCredentials = function (method, url, body, callback) {
  fetch(url, {
    method: method,
    credentials: 'include',
    headers: {
      'Accept': 'application/json',
      'Content-Type': 'application/json'
    },
    body: JSON.stringify(body),
    mode: 'cors'
  }).then((resp) => (resp.json())).then(function (data) {
    callback(data);
  });
};

const validate = function () {
  let em = document.login.username.value;
  let pw = document.login.password.value;
  let body = {
    'email': em,
    'password': pw
  };
  handleRequestWithBodyWithCredentials('post', LOGIN_NEW, body, showCompanyStats);
  console.log();
};
Note: Right now, the front-end is hosted on 'http://expivider.dk', and it makes calls to the API at 'http://api.expivider.dk' (which is actually hosted at 'expivider.herokuapp.com'; I'm using a custom domain).
Please let me know if you need any more info to help me out!
