I am trying, and trying, and trying to debug an add-on that I wrote for Thunderbird 78.10.2. It sends a message to a Python backend and then reads the reply back:
async function buttonListener(tab) {
  tabId = tab.id;
  port = browser.runtime.connectNative("backendMyEd");
  cD = await browser.compose.getComposeDetails(tab.id);
  var body = extractBody(cD);
  await port.postMessage(body);
  port.onMessage.addListener(getReply);
}
The listener is just a Python script that does:
import json, struct, sys

def get_message():
    if sys.version_info < (3, 0):
        len_field = sys.stdin.read(4)
    else:
        len_field = sys.stdin.buffer.read(4)
    if not len_field:
        sys.exit(0)
    message_length = struct.unpack('=I', len_field)[0]
    message = sys.stdin.read(message_length)  # <===== (stuck here)
    return json.loads(message)
This works roughly half the time, but not always. When it doesn't, my printfs tell me that it's stuck in the stdin read, even though port.postMessage() has returned. If I quit TB or remove my add-on, the sys.stdin.read() finishes.
Several things come to mind:
Is there a flush needed after port.postMessage()? Maybe it's not flushing every time?
Could it be related to message size? I tested with different sizes, and for messages under 8k it often works; for larger ones it fails more often.
Could there be some special characters in the message causing this?
Something to do with JS being single-threaded? But the function is async, and Python runs in a different process. I also tried doing a fork in Python; it still hangs off and on in sys.stdin.read()...
Any ideas on what's going on, or how I could debug further?
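One way to narrow it down is to register the listeners before posting and log the size of what actually goes out. This is only a debugging sketch built on the question's own code (the log text is illustrative, and extractBody is the question's own helper):
// Debugging sketch: attach onMessage/onDisconnect before posting, and log the
// serialized size of the payload. "backendMyEd" comes from the question above.
async function buttonListenerDebug(tab) {
  const port = browser.runtime.connectNative("backendMyEd");
  port.onMessage.addListener((reply) => console.log("reply from backend:", reply));
  port.onDisconnect.addListener(() => {
    // In Firefox/Thunderbird, port.error is set when the port closed due to an error.
    console.log("backend disconnected:", port.error && port.error.message);
  });

  const cD = await browser.compose.getComposeDetails(tab.id);
  const body = extractBody(cD);
  console.log("posting", JSON.stringify(body).length, "characters to the backend");
  port.postMessage(body); // Port.postMessage() does not return a Promise, so there is nothing to await
}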
Recently, while doing some Advent of Code, I came across a bug:
The bug happens in Firefox (unless you run the code as multiple chunks) but doesn't happen in Chrome.
When I run this code in the Firefox console, in a tab whose URL is https://www.google.com/robots.txt:
// Fetch input
var input = await fetch("https://www.google.com/robots.txt").then(r => r.text());
var lines = input.split("\n");
// Iterate
while(lines.length > 0) console.log(lines.shift());
It prints nothing, but when I run the code as two pieces (split at // Iterate) it works. Note that you can change the fetch URL and the URL of the tab; I chose robots.txt and made them the same to avoid CORS errors.
Does anyone know why this happens or how to fix this?
Edit: The code works if you wrap it in an async function and call that, but it should work regardless.
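For reference, the workaround from the edit looks roughly like this (same URLs as above):
// Wrapping the snippet in an async IIFE, as described in the edit.
(async () => {
  const input = await fetch("https://www.google.com/robots.txt").then(r => r.text());
  const lines = input.split("\n");
  while (lines.length > 0) console.log(lines.shift());
})();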
I wrote a very simple TypeScript program, which does the following:
Transform users.csv into an array
For each element/user, issue an API call to create that user on a 3rd-party platform
Print any errors
The CSV file has >160,000 rows and there is no way to create them all in one API call, so I wrote this program to run in the background on my computer for 20+ hours.
The first time I ran this, the code stopped mid-loop without an exception or anything. So I deleted the rows for users that had already been uploaded from the CSV file and re-ran the code. Unfortunately, this kept happening.
Interestingly, the code stops at non-deterministic iterations: one time it was at i=812, another at i=27650, and so on.
This is the code:
const main = async () => {
  const usersFile = await fsPromises.readFile("./users.csv", { encoding: "utf-8" });
  const usersArr = makeArray(usersFile);
  for (let i = 0; i < usersArr.length; i++) {
    const [userId, email] = usersArr[i];
    console.log(`uploading ${userId}. ${i}/${usersArr.length}`);
    try {
      await axios.post(/* create user */);
      await sleep(150);
    } catch (err) {
      console.error(`Error uploading ${userId} -`, err.message);
    }
  }
};
main();
I should mention that the try/catch is inside the for loop because many rows will fail to upload with a 400 error code. As such, I'd prefer the code to run non-stop and print any errors to a file, so that I can later re-run it just for the users that failed to upload. Otherwise I would have to check every 10 minutes whether it had halted because of an error.
Why does this happen, and what can I do?
After compiling, I run it as: node build/index.js 2>>errors.txt
EDIT:
There is no code after main() and no code outside the try ... catch block within the loop. errors.txt only contains 400 errors. And even if it contained some other run-time exception, it seems to me that this wouldn't/shouldn't halt execution, because the catch would run and the loop would move on to the next iteration.
I think this may have been related to this post. The file I was reading was extremely large, as noted, and it was held in a runtime variable. Non-deterministically, the OS could have decided that the memory demand was too high. This is probably a situation where a Readable Stream should be used instead of readFile.
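A minimal sketch of that streaming approach, using Node's built-in readline to process users.csv line by line instead of loading it all into memory at once. The endpoint URL, payload shape, and the naive comma split are placeholders standing in for the original makeArray / upload logic:
// Stream the CSV line by line instead of reading the whole file into memory.
const fs = require("fs");
const readline = require("readline");
const axios = require("axios");

const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

const main = async () => {
  const rl = readline.createInterface({
    input: fs.createReadStream("./users.csv", { encoding: "utf-8" }),
    crlfDelay: Infinity, // treat \r\n as a single line break
  });

  let i = 0;
  for await (const line of rl) {
    const [userId, email] = line.split(","); // placeholder for makeArray's parsing
    console.log(`uploading ${userId}. ${i++}`);
    try {
      await axios.post("https://api.example.com/users", { userId, email }); // hypothetical endpoint and payload
      await sleep(150);
    } catch (err) {
      console.error(`Error uploading ${userId} -`, err.message);
    }
  }
};

main();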
I have a NodeJS server managing some files. It's going to watch for a known filename from an external process and, once received, read it and then delete it. However, the read/delete is sometimes attempted before the file has been "unlocked" from previous use, so it will likely fail occasionally. What I'd like to do is retry this file ASAP, either as soon as it's freed or continuously at a fast pace.
I'd rather avoid a long sleep where possible, because this needs to be handled ASAP and every second counts.
fs.watchFile(input_json_file, {interval: 10}, function(current_stats, previous_stats) {
  var json_data = "";
  try {
    var file_cont = fs.readFileSync(input_json_file); // < TODO: async this
    json_data = JSON.parse(file_cont.toString());
    fs.unlink(input_json_file);
  } catch (error) {
    console.log("The JSON in the file could not be parsed. File will continue to be watched.");
    console.log(error);
    return;
  }
  // Else, this has loaded properly.
  fs.unwatchFile(input_json_file);
  // ... do other things with the file's data.
});
// set a timeout for the file watching, just in case
setTimeout(fs.unwatchFile, CLEANUP_TIMEOUT, input_json_file);
I expect "EBUSY: resource busy or locked" to turn up occasionally, but fs.watchFile isn't always called when the file is unlocked.
I thought of creating a function and then calling it with a delay of 1-10ms, where it could call itself if that fails too, but that feels like a fast route to a... cough stack overflow.
I'd also like to steer clear of synchronous methods so that this scales nicely, but being relatively new to NodeJS, all the callbacks are starting to turn into a maze.
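For what it's worth, a retry scheduled with setTimeout starts each attempt on a fresh call stack, so it can't overflow the way direct recursion would. A rough callback-based sketch of that idea (the file name, delay, and attempt count are made up for illustration):
// Retry sketch: each attempt is scheduled with setTimeout, so retries run on a
// fresh stack. Gives up after a fixed number of attempts.
var fs = require("fs");

function readParseAndDelete(file, attemptsLeft, done) {
  fs.readFile(file, function(readErr, contents) {
    if (readErr) {
      if (attemptsLeft > 0) {
        // File is probably still locked (e.g. EBUSY); try again shortly.
        setTimeout(function() { readParseAndDelete(file, attemptsLeft - 1, done); }, 10);
      } else {
        done(readErr);
      }
      return;
    }
    var data;
    try {
      data = JSON.parse(contents.toString());
    } catch (parseErr) {
      return done(parseErr);
    }
    fs.unlink(file, function(unlinkErr) { done(unlinkErr, data); });
  });
}

// Usage: retry up to 500 times at ~10 ms intervals (~5 seconds total).
readParseAndDelete("input.json", 500, function(err, data) {
  if (err) return console.error("Giving up:", err);
  // ... do other things with the file's data.
});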
Maybe this is overkill for your case, but you can create your own filesystem with full control; other programs would then write data directly to your program. Just search for the words fuse and fuse-binding.
I'm trying to make a fun little Discord chat bot with JavaScript and Node.js, and I'd like to put in a specific command without it affecting another one I already have set up.
She works wonderfully on all the servers I have her on, and I've got it set up so that when someone in the server says anything containing "rei are", she responds with a constant from areResponses.
//const!!!
const areResponses = ["HELL yeah!", "Yep!", "I'm pretty sure that's true!", "I'm not gonna put all the responses here because then it'd be too long..."];
//there's some other bot stuff (console log, now playing) under here but it isn't relevant until...
//the basics...
if (message.content.toLowerCase().includes("rei are")) {
  var response = areResponses[Math.floor(Math.random() * areResponses.length)];
  message.channel.send(response).then().catch(console.error);
}
What I'd like, preferably, is for the following command to work without also setting off the "rei are" command I coded in.
if (message.content.toLowerCase().includes("rei are you happy")) {
  message.channel.send("Yes, absolutely.");
}
As of right now, whenever I type the above command, it triggers both the "rei are" command AND the "rei are you happy" command, so the bot sends two messages...
else/if chains work beautifully for this actually!!!
if (message.content.toLowerCase().includes("rei do you like girls")) {
  message.channel.send("Yes, absolutely. Girls are just... yeah...");
}
else if (message.content.toLowerCase().includes("rei are")) {
  var response = areResponses[Math.floor(Math.random() * areResponses.length)];
  message.channel.send(response).then().catch(console.error);
}
All you need to do is put the broad command that overlaps with the longer ones (here, "rei are") at the very bottom of the else-if chain, and then you're good!!
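An alternative sketch (not from the answer above, and assuming this code lives inside the message event handler, as in the question): check the most specific phrase first and return once a reply has been sent, which gives the same precedence without one long else-if chain.
// Check the most specific phrase first, then fall through to the generic "rei are".
const text = message.content.toLowerCase();
if (text.includes("rei are you happy")) {
  message.channel.send("Yes, absolutely.").catch(console.error);
  return;
}
if (text.includes("rei are")) {
  const response = areResponses[Math.floor(Math.random() * areResponses.length)];
  message.channel.send(response).catch(console.error);
  return;
}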
Here is something that's been driving me crazy for an hour now. I'm working on a side project that involves accessing ElasticSearch with JavaScript. As part of the tests, I wanted to create an index. Here is a very simple snippet that, in my mind, should do this and print the messages returned from the ElasticSearch server:
var es = require('elasticsearch');
var es_client = new es.Client({host: "localhost:9200"});
var breaker = Math.floor((Math.random() * 100) + 1);
var create_promise = es_client.indices.create({index: "test-index-" + breaker});
create_promise.then(function(x) {
  console.log(x);
}, function(err) { console.log(err); });
What happens when I go to a directory, run npm install elasticsearch, and then run this code with NodeJS, is that the request is made, but the promise does not seem to resolve for some reason. I would expect this code to run to the end and finish once the response from the ES server comes back. Instead, the process just hangs. Any ideas why?
I know that an index can be created just by adding a document to it, but this weird behavior bugged me, and I couldn't figure out the reason or the sense behind it.
By default the client keeps persistent connections to elasticsearch so that subsequent requests to the same node are much faster. This has the side effect of preventing Node from exiting normally until client.close() is called. You could either add that call to your callback, or disable keepAlive connections by adding keepAlive: false to your client config.
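Applied to the question's snippet, the two options look roughly like this (index name hardcoded here for brevity):
var es = require('elasticsearch');

// Option 1: close the client when you're done so Node can exit.
var es_client = new es.Client({host: "localhost:9200"});
es_client.indices.create({index: "test-index-1"})
    .then(function(x) { console.log(x); }, function(err) { console.log(err); })
    .then(function() { es_client.close(); });

// Option 2: disable persistent connections entirely.
var other_client = new es.Client({host: "localhost:9200", keepAlive: false});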