We are experiencing an issue writing files with Node.js in our web application. Node.js is hosted in IIS using IISNode.
Everything works fine on our dev server, but the same code is causing problems on our production server (Windows 7, IIS 7, Windows Server 2008 R2 Standard, Service Pack 1).
Basically, on a button click, multiple HTML files are created by Node and content is written to them. The problem is that the content is not written to all of the created files: all the files are created, but some of them are left blank.
Here is an excerpt of code that is being used to create and write to files:
const $ = cheerio.load(cssData + videoHtml + pageData.content);
let indexFile = fs.openSync(file, 'w+');   // open (or create) the file
fs.writeFileSync(indexFile, $.html(), { encoding: 'utf8' });
fs.closeSync(indexFile);
Try adding a mode in the options:
const $ = cheerio.load(cssData + videoHtml + pageData.content);
// Writing via the path directly; no file descriptor is opened here,
// so there is nothing to close afterwards.
fs.writeFileSync(file, $.html(), { encoding: 'utf8', mode: 0o755 });
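If adding the mode alone doesn't resolve the blank files, one further variation (my addition, not part of the original answer) is to write through a file descriptor and flush it explicitly with fs.fsyncSync before closing, so buffered content is forced out to disk:
// Sketch: write, then explicitly flush to disk before closing.
// 'file' and '$' are the same variables as in the snippets above.
const fd = fs.openSync(file, 'w');
fs.writeFileSync(fd, $.html(), { encoding: 'utf8' });
fs.fsyncSync(fd);   // flush kernel buffers for this descriptor
fs.closeSync(fd);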
I'm trying to use the Chrome driver with Selenium in Firebase Cloud Functions.
When I deploy the index.js file to a local host using the terminal command 'firebase serve --only functions' from the functions folder (which contains the chromedriver file), everything works as it's supposed to and I get a response.
But when I deploy the index.js file to the real Firebase servers using 'firebase deploy --only functions' and then trigger the function from my app, or from a URL, I get an error in the Firebase Cloud Functions logs telling me:
'Function execution took 1774 ms, finished with status: 'crash''
Since there were no details about what was causing this 'crash', I used console.log() and printed each line that executed successfully. This led me to this line:
let driver = await new Builder().forBrowser('chrome').setChromeOptions(new chrome.Options().headless().windowSize(screen)).build();
It seems that in the Firebase Cloud Function I can't, for some reason, create a Chrome driver successfully, although in the localhost version it works as expected.
Here is the full code of the function:
exports.initializedChromeDriver = functions.https.onRequest((request, response) => {
    async function start_chrome_driver() {
        functions.logger.info('Hello logs!', {structuredData: true});
        console.log("did enter the function");
        const google_site = "https://www.google.com";
        const {WebDriver, Builder, By} = require('selenium-webdriver');
        console.log("did create WebDriver, Builder, By constants");
        const chrome = require('selenium-webdriver/chrome');
        console.log("chrome constant was created");
        const screen = {
            width: 1024,
            height: 1024
        };
        let driver = await new Builder()
            .forBrowser('chrome')
            .setChromeOptions(new chrome.Options().headless().windowSize(screen))
            .build();
        console.log("driver was set");
        await driver.get(google_site);
        console.log("succ loading google");
        return "succ loading google";
    }
    // dic must live in this scope: it is used in the .then() callback below,
    // not inside start_chrome_driver.
    const dic = {};
    const p = start_chrome_driver().then((value) => {
        dic['status'] = 200;
        dic['data'] = {"message": value};
        response.send(dic);
    });
});
And here are the logs in the Firebase Cloud console:
UPDATE
After searching the web for answers, I found that I didn't have a suitable driver for Linux systems, so I replaced the Chrome driver for Mac (which works on my machine in the localhost version) with a Chrome driver for Linux and tried again. That doesn't work either, but at least I got a new log with an actual error instead of just a 'crash':
So now I know that I need to install the Chrome browser binary for Linux in the project. Any ideas on how I can do this? I'm trying to install the Chrome binary into my Firebase project, and there are a few approaches I'm trying now, such as somehow installing it using the Google Cloud Shell; any help or ideas would be appreciated. Even so, I'm not sure that once the binary is installed that will be it and the driver will work on the Firebase servers.
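For what it's worth, here is a minimal sketch of pointing selenium-webdriver at binaries bundled with the deployed function. It assumes a selenium-webdriver 4.x-style API, and the bin/ paths are hypothetical placeholders, not from the original post:
const path = require('path');
const chrome = require('selenium-webdriver/chrome');
const { Builder } = require('selenium-webdriver');

async function buildLinuxDriver() {
    // Hypothetical locations for binaries shipped alongside index.js.
    const chromedriverPath = path.join(__dirname, 'bin', 'chromedriver');
    const chromeBinaryPath = path.join(__dirname, 'bin', 'chrome');
    const service = new chrome.ServiceBuilder(chromedriverPath);
    const options = new chrome.Options()
        .headless()
        .windowSize({ width: 1024, height: 1024 })
        .setChromeBinaryPath(chromeBinaryPath); // where the Chrome browser itself lives
    return new Builder()
        .forBrowser('chrome')
        .setChromeService(service)
        .setChromeOptions(options)
        .build();
}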
Preamble
To start off, I'm not a developer; I'm just an analyst / product owner with time on their hands. While my team's actual developers have been busy finishing off projects before year-end, I've been attempting to put together a very basic API server in Node.js for something we will look at next year.
I used Swagger to build an API spec and then used the Swagger code generator to get a basic Node.js server. The full code is near the bottom of this question.
The Problem
I'm coming across an issue when writing out to a log file using the fs module. I know that the ENOENT error is usually down to just specifying a path incorrectly, but the behaviour doesn't occur when I comment out the Swagger portion of the automatically generated code. (I took the logging code directly out of another tool I built in Node.js, so I'm fairly confident in that portion at least...)
When executing npm start, a few debugging items write to the console:
"Node Server Starting......
Current Directory:/mnt/c/Users/USER/Repositories/PROJECT/api
Trying to log data now!
Mock mode: disabled
PostgreSQL Pool created successfully
Your server is listening on port 3100 (http://localhost:3100)
Swagger-ui is available on http://localhost:3100/docs"
but then fs throws an ENOENT error:
events.js:174
throw er; // Unhandled 'error' event
^
Error: ENOENT: no such file or directory, open '../logs/logEvents2021-12-24.log'
Emitted 'error' event at:
at lazyFs.open (internal/fs/streams.js:277:12)
at FSReqWrap.args [as oncomplete] (fs.js:140:20)
Investigating
Now normally, from what I understand, this would just mean I've got the paths wrong. However, the file has actually been created, and the first line of the log file has been written just fine.
My next thought was that I must've set the fs flags incorrectly, but it was set to 'a' for append:
// Note: fs.createWriteStream takes no callback; errors are emitted on the
// stream's 'error' event instead.
var logsFile = fs.createWriteStream(__logdir + "/logEvents" + dateNow() + '.log', { flags: 'a' });
logsFile.on('error', (err) => {
    console.error('Could not write new Log File to location: %s \nWith error description: %s', __logdir, err);
});
Removing Swagger Code
Now here's the weird bit: if I remove the Swagger code, the log files write out just fine and I don't get the fs exception!
This is the specific Swagger code:
// swaggerRouter configuration
var options = {
    routing: {
        controllers: path.join(__dirname, './controllers')
    },
};
var expressAppConfig = oas3Tools.expressAppConfig(path.join(__dirname, '/api/openapi.yaml'), options);
var app = expressAppConfig.getApp();

// Initialize the Swagger middleware
http.createServer(app).listen(serverPort, function () {
    console.info('Your server is listening on port %d (http://localhost:%d)', serverPort, serverPort);
    console.info('Swagger-ui is available on http://localhost:%d/docs', serverPort);
}).on('error', console.error);
When I comment out this code, the log file writes out just fine.
The only thing I can think might be happening is that Swagger somehow modifies (?) the app's working directory, so that fs no longer resolves the same file path?
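To make that hypothesis concrete, here is a minimal standalone sketch (my addition, not part of my server code) showing that cwd-relative paths move around while __dirname-anchored ones do not:
// Minimal sketch: '../logs/...' style paths resolve against process.cwd(),
// which depends on where node was launched; __dirname is always the
// directory containing this file.
const path = require('path');
console.log('cwd:       ' + process.cwd());
console.log('__dirname: ' + __dirname);
console.log('cwd-based: ' + path.resolve('../logs'));      // moves with cwd
console.log('anchored:  ' + path.join(__dirname, 'logs')); // stable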
Full Code
'use strict';

var path = require('path');
var fs = require('fs');
var http = require('http');
var oas3Tools = require('oas3-tools');
var serverPort = 3100;

// I specifically tried using path.join, which I found while investigating
// this issue, and referencing the app path, but to no avail
const __logdir = path.join(__dirname, './logs');

// These are date and time functions I use to add timestamps to the logs
function dateNow() {
    var dateNow = new Date().toISOString().slice(0, 10).toString();
    return dateNow;
}
function rightNow() {
    var timeNow = new Date().toTimeString().slice(0, 8).toString();
    return "[" + timeNow + "] ";
};

console.info("Node Server Starting......");
console.info("Current Directory: " + __dirname);

// Here I create the WriteStreams. Note: fs.createWriteStream takes no
// callback; errors are emitted on the stream's 'error' event instead.
var logsFile = fs.createWriteStream(__logdir + "/logEvents" + dateNow() + '.log', { flags: 'a' });
logsFile.on('error', (err) => {
    console.error('Could not write new Log File to location: %s \nWith error description: %s', __logdir, err);
});
var errorsFile = fs.createWriteStream(__logdir + "/errorEvents" + dateNow() + '.log', { flags: 'a' });
errorsFile.on('error', (err) => {
    console.error('Could not write new Error Log File to location: %s \nWith error description: %s', __logdir, err);
});

// And create an additional console to write data out:
const Console = require('console').Console;
var logOut = new Console(logsFile, errorsFile);
console.info("Trying to log data now!"); // Debugging logging
logOut.log("========== Server Startup Initiated ==========");
logOut.log(rightNow() + "Server Directory: " + __dirname);
logOut.log(rightNow() + "Logs directory: " + __logdir);

// Here is the Swagger portion that seems to create the behaviour.
// It is unedited from the Swagger Code-Gen tool

// swaggerRouter configuration
var options = {
    routing: {
        controllers: path.join(__dirname, './controllers')
    },
};
var expressAppConfig = oas3Tools.expressAppConfig(path.join(__dirname, '/api/openapi.yaml'), options);
var app = expressAppConfig.getApp();

// Initialize the Swagger middleware
http.createServer(app).listen(serverPort, function () {
    console.info('Your server is listening on port %d (http://localhost:%d)', serverPort, serverPort);
    console.info('Swagger-ui is available on http://localhost:%d/docs', serverPort);
}).on('error', console.error);
In case it helps, this is the project's file structure. I am running this project within a WSL instance in VSCode on Windows, the same as I have with other projects using fs.
Is anyone able to help me understand why fs can write the first log line but then break once the Swagger code gets going? Have I done something incredibly stupid?
Appreciate the help, thanks!
Found the problem with some help from a friend. The issue boiled down to a lack of understanding of how the Swagger module works in the background, so this will likely be eye-rollingly obvious to most, but I'm keeping this post around in case anyone else comes across this down the line.
It seems that as part of the Swagger initialisation, any scripts within the utils folder are also executed. I would not have picked up on this if it hadn't been pointed out to me that in the middle of the console output there was a reference to some PostgreSQL code, even though I had removed all reference to it from the main index.js file.
That's when I realised that the error wasn't actually being generated by the code posted above: it was being thrown from a script in that folder.
So I guess the answer is don't add stuff to the utils folder, but if you do, always add a bunch of console logging...
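As a side note, a common Node idiom (my sketch, not from the original answer) keeps a utils script inert when it is merely require()d by a framework such as oas3-tools, while still letting you run it directly:
// Hypothetical utils/db-setup.js: wrap side effects in main() and only
// invoke it when the script is executed directly (node utils/db-setup.js),
// not when Swagger/oas3-tools require()s it during initialisation.
function main() {
    // ... PostgreSQL pool creation, logging, etc. would go here
}

if (require.main === module) {
    main();
}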
Goal
My goal is to display my IP cam's RTSP output stream on a standard HTML page (HTML5 + CSS3 + vanilla JavaScript, no magic = no plugins). The HTML page should be hosted in an NGINX web server on my Raspberry Pi.
My equipment
The setup I am using is a Raspberry Pi 3 B+ with Raspbian OS, Node.js and the Node-Media-Server package, and NGINX (though I do not believe NGINX is important for my problem; I have not made any config for Node-Media-Server in it anyway), plus an IP camera and a browser.
What I have tried
The readme in the Node-Media-Server project is detailed, and there is a tutorial describing almost exactly what I want to do. Specifically, there is a markup example showing how the live stream can be accessed:
<html>
<head>
    <title>Camera</title>
</head>
<body>
    <script src="https://cdn.bootcss.com/flv.js/1.4.0/flv.min.js"></script>
    <video id="videoElement"></video>
    <script>
        if (flvjs.isSupported()) {
            var videoElement = document.getElementById('videoElement');
            var flvPlayer = flvjs.createPlayer({
                type: 'flv',
                url: 'http://localhost:8000/live/uterum.flv'
            });
            flvPlayer.attachMediaElement(videoElement);
            flvPlayer.load();
            flvPlayer.play();
        }
    </script>
</body>
</html>
This is how I start the media server on my Raspberry Pi, kommandoran-mediaserver.js:
const { NodeMediaServer } = require('node-media-server');

const config = {
    logType: 3, // 3 - Log everything (debug)
    rtmp: {
        port: 1935,
        chunk_size: 60000,
        gop_cache: true,
        ping: 60,
        ping_timeout: 30
    },
    http: {
        port: 8000,
        allow_origin: '*'
    },
    relay: {
        ffmpeg: '/usr/local/bin/ffmpeg',
        tasks: [
            {
                app: 'cctv',
                mode: 'static',
                edge: 'rtsp://<USER>:<PASSWORD>@10.0.0.111/live1.sdp',
                name: 'uterum',
                rtsp_transport: 'tcp' // ['udp', 'tcp', 'udp_multicast', 'http']
            }
        ]
    }
};

var nms = new NodeMediaServer(config);
nms.run();
My problem and question
When I try to view camera.html (see the markup above) in the Chromium browser on the Raspberry Pi (i.e. localhost), nothing is displayed. In the Chromium debug inspector there are no JavaScript errors, but I get this:
GET http://localhost:8000/live/uterum.flv net::ERR_EMPTY_RESPONSE
Here is a screenshot from the node terminal:
The red area illustrates the output when I try to make a request to http://localhost:8000/live/uterum.flv.
I suppose I'm trying to reach the wrong endpoint, but which one is correct? The documentation states http://localhost:8000/live/STREAM_NAME.flv. What is "STREAM_NAME" in my case?
As you can see from the configuration, your RTSP stream is pushed to the 'cctv' application.
So your playback address should be:
rtmp://localhost/cctv/uterum
or
http://localhost:8000/cctv/uterum.flv
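In other words, the only change needed in the HTML example above is the player URL. A minimal sketch of the adjusted flv.js call:
// Point flv.js at the 'cctv' app from the relay config instead of 'live'.
var flvPlayer = flvjs.createPlayer({
    type: 'flv',
    url: 'http://localhost:8000/cctv/uterum.flv'
});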
I have a Node file that runs a Karma test in a Node app using the Karma public API (I'll save writing out the code because it comes straight from http://karma-runner.github.io/0.13/dev/public-api.html).
All is fine so far; the test runs. Now I need to start serving different files to the Karma run. For example, I might have exampleSpec.js, example1.js, example2.js, and example3.js. I need to serve exampleSpec and then example1-3 in sequence.
However, I don't see any documentation on this, and I can't seem to get anywhere with it.
So, the answer ended up being pretty simple. The first argument to the Server constructor is a config object that can replace or augment karma.conf.js, so it is possible to send in altered files arrays. Code below for posterity:
"use strict";
var Server = require('karma').Server;
var filePath = process.cwd();
filePath += "/karma.conf.js";
console.log(filePath);
//files needs to be an array of string file matchers
function runTests(files, port) {
var config = {
configFile: filePath,
files: files,
port: port
};
var server = new Server(config, function(exitCode) {
console.log('Karma has server exited with ' + exitCode);
process.exit(exitCode)
});
server.on('run_complete', function (browser, result) {
console.log('A browser run was completed');
console.log(result);
});
server.start();
}
runTests(['test/**/*Spec.js', 'tmp/example.js'], 9876);
runTests(['test/**/*Spec.js', 'tmp/example2.js'], 9877);
runTests(['test/**/*Spec.js', 'tmp/example3.js'], 9878);
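One caveat: the three runTests calls above start three Karma servers side by side on different ports rather than strictly one after another. If a truly sequential run is wanted (as the question asks), here is a hedged sketch of chaining runs off the 'run_complete' event; runSequentially is my own helper name, not part of the Karma API, and it assumes karma.conf.js is set up (e.g. singleRun) so each run actually completes:
function runSequentially(fileSets, port) {
    if (fileSets.length === 0) {
        return; // nothing left to run
    }
    var server = new Server({
        configFile: filePath,
        files: fileSets[0],
        port: port
    });
    // When this run finishes, kick off the next file set on the next port.
    server.on('run_complete', function (browsers, results) {
        runSequentially(fileSets.slice(1), port + 1);
    });
    server.start();
}

runSequentially([
    ['test/**/*Spec.js', 'tmp/example.js'],
    ['test/**/*Spec.js', 'tmp/example2.js'],
    ['test/**/*Spec.js', 'tmp/example3.js']
], 9876);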
I'm trying to use a Socket connection to read a file on a remote website. So far, my code:
var conn = new Socket();
if (conn.open('example.com:80')) {
    conn.write('GET /indesign-page/ HTTP/1.0' + "\n\n");
    var reply = conn.read(999999);
    conn.close();
} else {
    alert('Problem connecting to server');
}
The socket connects to example.com fine, but the request comes across as this:
GET http://localhost/indesign-page/ HTTP/1.0
when it should be this:
GET http://example.com/indesign-page/ HTTP/1.0
I've tried changing the conn.write parameters to 'GET http://example.com/indesign-page/ ...', but then it comes across as:
GET http://localhosthttp://example.com/indesign-page/ HTTP/1.0
The web server requires the Host to be set correctly in order to serve the right content.
You need to set the "Host" header. Note that the request line and each header line end with \r\n, and the header block is terminated by a blank line:
conn.write( 'GET /indesign-page/ HTTP/1.0\r\n' + 'Host: example.com\r\n' + '\r\n' );
Because conn.open( 'example.com:80' ) only resolves example.com to an IP address and then connects to port 80 on that address, the web server does not know which hostname you resolved before connecting; the Host header is what carries that information.
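Putting that together with the question's code, here is a fuller sketch of the corrected request, reusing the same host and path from above:
var conn = new Socket();
if (conn.open('example.com:80')) {
    // Request line and each header end with \r\n; a blank line ends the headers.
    conn.write('GET /indesign-page/ HTTP/1.0\r\n' +
               'Host: example.com\r\n' +
               '\r\n');
    var reply = conn.read(999999);
    conn.close();
} else {
    alert('Problem connecting to server');
}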
Do you need to use a manual socket object? On Adobe's Community Site there's a mention of this already-created FTP Script where you can call a GET or PUT on a file on an FTP server.
Otherwise, which OS are you using? If you'll always be on a Mac, you could shell out to an AppleScript command and place the file anywhere you'd like:
var command = 'do shell script "curl http://localhost/indesign-page"';
var response = app.doScript(command, ScriptLanguage.APPLESCRIPT_LANGUAGE);
The nice thing about the AppleScript is that you can execute the command manually using AppleScript Editor (or Script Editor if you're on a version earlier than 10.6).