Passing data between two plain js files in ejs templating - javascript

My file structure is as follows:
(screenshot of the file structure)
I have two EJS views. I am taking a variable from index.ejs using document.querySelector; this variable is stored in the index.js file.
I need to access this variable in board.js.
How can I do so?
I have tried using:
module.exports = varName and then require in board.js, but it isn't working.
index.js file
const btn = document.querySelector(".level1");
var levelMode;
btn.addEventListener("click", () => {
  levelMode = btn.innerHTML;
  alert(levelMode);
});
module.exports = levelMode;
board.js file
var levelMode = require("./index");
The console shows the following error:
Uncaught ReferenceError: require is not defined

I'm a new developer, so take my advice with a grain of salt, but I need to start by saying that JS files don't "store" information beyond their source code. Any value held in a variable lives in the browser running the script, not in the file. That is also why you see the error above: require is a Node.js (CommonJS) feature and is not defined in browser scripts. If you need to pull data from the client, my best advice is to first pass it to the server. You should get more familiar with the module you are using for HTTP requests.
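A minimal sketch of that advice: package the clicked value and POST it to the server. The route name "/level" and the helper name buildLevelRequest are assumptions, not part of the original code.

```javascript
// Hypothetical helper: package the selected level as a fetch() options
// object for a JSON POST to the server.
function buildLevelRequest(levelMode) {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ levelMode }),
  };
}

// In index.js (browser side), instead of module.exports:
// btn.addEventListener("click", () => {
//   fetch("/level", buildLevelRequest(btn.innerHTML));
// });
```

On the server, an Express route could then store the value and expose it to board.js via the template or another endpoint.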


Accessing keepass database returns KdbxError: Error BadSignature

I am using the kdbxweb library.
My goal is to open a kdbx database file, and then retrieve a password from it.
Following the example on the page, and also inspired by some things I saw in the keepass code, which uses this lib, I came up with this:
const password = kdbxweb.ProtectedValue.fromString('secret');
const credentials = new kdbxweb.Credentials(password);
const file = kdbxweb.ByteUtils.arrayToBuffer(
kdbxweb.ByteUtils.base64ToBytes('/home/chai/code/Kairos/src/e2e/db.kdbx'),
);
const db = await kdbxweb.Kdbx.load(file, credentials);
Sadly when I run it it gives me : Error | KdbxError: Error BadSignature
The file and password are correct; I verified that using the keepass application, which will open it without issue.
Any ideas are welcome! Thx!
Okay, I found out what the problem was: it was in the structure of the keepass kdbx file. It was generated by importing a CSV, but somehow all entries were directly under the root, which gave me errors. Restructuring it with a group "db" (as per the default) and then putting the entries under that solved the issue.

Cannot write to JSON in Nodejs

I'm trying to make a live JSON database of IDs and a tag. The database refreshes by reading from the JSON file, and I have traced my problem to Node.js not writing to disk, though I don't quite know why.
This is my reading operation, and yes there is a file there with proper syntax.
let dbraw = fs.readFileSync('db.json');
var db = JSON.parse(dbraw);
This is my writing operation, where I need to save the updates to disk.
var authorid = msg.author.id
db[authorid] = "M";
fs.writeFileSync('db.json', JSON.stringify(db));
Am I doing something wrong? Is it a simple error I am just forgetting? Is there an easier/more efficient way to do this I am forgetting about? I can't seem to figure out what exactly is going wrong, but it has something to do with these two bits. There are no errors in my console, just the blank JSON file it reads every time on the Read Operation.
There is a problem with your JSON file's path.
Try using __dirname.
__dirname tells you the absolute path of the directory containing the currently executing file.
— source (DigitalOcean)
Example:
If the JSON file is in the root directory:
let dbraw = fs.readFileSync(__dirname + '/db.json');
var db = JSON.parse(dbraw);
If the JSON file is in a subdirectory:
let dbraw = fs.readFileSync(__dirname + '/myJsonFolder/' + 'db.json');
var db = JSON.parse(dbraw);
Side note: I suggest you read about Google Firestore, as it will be a faster way to work with real time updates.
Here's a simple block that does what is desired:
const fs = require('fs');

let file_path = __dirname + '/db.json',
    dbraw = fs.readFileSync(file_path),
    db = JSON.parse(dbraw),
    authorid = 'abc';

console.log(db);
db[authorid] = "M";
fs.writeFileSync(file_path, JSON.stringify(db));

dbraw = fs.readFileSync(file_path);
db = JSON.parse(dbraw);
console.log(db);
I've added a couple of log statements for debugging. This works and so there may be something else that's missing or incorrect in your flow. The most probable issue would be that of different path references as pointed out by jfriend00 in the comment to your question.
As for better solutions, here are a few suggestions:
- require the JSON file directly instead of reading it, if the file is small; require will do the parsing for you
- use the async fs functions
- stream the file if it is big
- see if you can use a cache like Redis, or a database, as the storage layer, to reduce your app's serialization and deserialization overhead

How to access or create File objects using Mocha (NodeJS)?

I'm a beginner in JavaScript and am creating my first project. I made a File Upload program which works, and I am now adding unit tests. I am trying to test how my program handles various types of files (valid and invalid CSV/JSON files). However, my FileUploadHandler expects a File object, and Mocha, which runs under Node.js, does not have a File class, so I cannot call new File to create File objects to pass in.
I tried to make a bunch of files manually and then pass them into the Mocha unit tests by path, but I do not know how to get them passed in as File objects. I looked it up, and it seems that you cannot access File objects by path in JS; if that is the case, how else can I do my unit testing?
I hope this was clear, but if not, I can clarify further. Please help, and thank you so much.
(The function I want to test takes in a File object.)
Example:
describe('File testing', function() {
  it('cannot process non csv/jsons', function() {
    let f = new File([""], "html_file.html");
    let tester = new FileUploadHandler(f);
    // expecting an error here
  });
});
In my implementation I use mocha and chai to make expectations on desired inputs. Here is an example of a failing test when a user tries to upload two files in one HTML form (this is not allowed by my multer rules, which are configured for single upload):
it('Check can not post more than 1 file', function(done) {
  api.post('/files')
    .set('Accept', 'application/json; charset=utf-8')
    .field('Content-Type', 'multipart/form-data')
    .field("sequence", 1)
    .field("media", 10)
    .attach("image", 'test/files/edit.png')
    .attach("image", 'test/files/edit.png')
    .expect(500)
    .end(function(err, res) {
      expect(res.status).to.equal(500);
      done();
    });
});
I just declared my form inputs, plus a field called image that is the data holder for the image.

MEAN stack how to send a user a file

I have some code on the express controller that looks like this:
// Correct file path and write data
var currentDate = new Date();
var storagePath = path.join(__dirname,'../../public/reports/', 'Risk-Log.csv');
var source = es.readArray(riskLogs);
source.pipe(jsonCSV.csv(options)).pipe(fs.createWriteStream(storagePath));
console.log('Completed csv export to ' + storagePath);
// Send file back
res.sendFile(storagePath);
In my angular view (log.client.view.html) I have the following:
Download CSV
When I click that button, the file gets generated properly, but it never gets sent back to the user (as far as they can tell).
If I look at the console debugger I get the following:
Resource {$promise: Object, $resolved: false, $get: function, $save: function, $query: function…}
I haven't done any routing with Angular or anything special since it's hitting the Node (express) controller and generating the file. I'm wondering if this is something that I should be doing in Angular?
What I'm trying to achieve is that when the user clicks that button the CSV downloads.
Any help would be MUCH appreciated.
Since Express's res.send() sends a buffer, you have to set the content type, like so:
res.set('Content-Type', 'text/csv');
res.set('Content-Disposition', 'attachment');
res.send(csv); // csv object created with json-csv npm module
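The same idea, factored into a small helper so it can be exercised without a running server. The filename parameter on Content-Disposition is an addition of mine (it gives the browser a download name); everything else mirrors the answer above:

```javascript
// Send a CSV string as a file download through an Express-style
// response object (res.set and res.send are the standard Express API).
function sendCsv(res, csv, filename) {
  res.set('Content-Type', 'text/csv');
  res.set('Content-Disposition', 'attachment; filename="' + filename + '"');
  res.send(csv);
}

// In the controller:
// sendCsv(res, csv, 'Risk-Log.csv');
```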
Big thanks to all those who commented on this!

How do I read and write files to the server with Meteor?

I'm working on a NoDB CMS in Meteor, but I'm new to both Meteor and JavaScript frameworks.
How do I go about reading and writing files to the server?
Within the Node fs module you have a writeFile function.
var fs = Npm.require('fs');

var getUser = Meteor.users.findOne({_id: Meteor.userId()});
var userObject = JSON.stringify(getUser);
var path = process.env["PWD"] + "/public/";
fs.writeFile(path + Meteor.userId() + '.txt', userObject,
  function (err) {
    if (err) throw err;
    console.log('Done!');
  }
);
The above snippet would create a file with all the information of the user. You could access properties of your query result, such as getUser._id, to prepare your data parameter (a String or Buffer) for pretty printing.
All this of course is server side.
You can try to use Npm.require inside the startup function, like so:
Meteor.startup(function () {
  fs = Npm.require('fs');
});
But you should definitely have a look at collectionFS, which does what you are looking for: storing files on the server and allowing you to retrieve them.
An added advantage is that you can distribute everything over many nodes of a MongoDB cluster.
To manipulate image files, you can use imagemagick with Node.js, which should let you transform them any way you need.
The node fs module is a start. http://nodejs.org/api/fs.html
You might want to be a bit more specific with your question though, as it's kind of broad.
