I need to read a JSON file located on my machine from my React application, and nothing seems to work.
I tried importing fs (with both import and require), but it returns a blank object and I cannot use any of its functions.
jQuery doesn't seem to find my file either.
Everywhere I searched, people use fs, jQuery, or FileReader (the last one is always used for files that the client uploaded to the page).
This is one of the solutions I tried with fs:
const fs = require("fs");
export default function getJson() {
let rawdata = fs.readFile("file.json");
let json = JSON.parse(rawdata);
console.log(json);
}
When I reload the browser, I get:
TypeError: fs.readFile is not a function
With jQuery I tried this:
import $ from "jquery";
export default function getJson() {
$.getJSON("file.json", function() {
console.log("success");
});
}
The console shows this:
Failed to load resource: the server responded with a status of 404 (Not Found)
I hope you can help me.
Thanks in advance!
fs is a Node.js module that bridges Node's runtime and the underlying OS's file system. It does not exist in browser code (e.g. a React app running in the browser).
For security reasons, browsers prevent JavaScript from interacting with the underlying platform, so to read the file you have two options (as far as I know):
Either create a file-input field in HTML (like the file-upload fields you usually see), or serve the file from some server. You can put the file in your assets folder and then request it from React.
Update 1:
From my experience, one of the easier ways to include a "runtime-configuration" is to serve the configuration as an asset.
I don't have much experience in ReactJS, but the idea is the same.
Let's say you put config.json in public/json/config.json. Then you can read this file from the application by requesting it.
Here's an example:
fetch("/json/config.json").then(resp => resp.json()).then(console.log);
and here's a small working stackblitz example
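For instance, a small hypothetical helper around that fetch call (the /json/config.json path is just the assumed asset location from above):

```javascript
// Hypothetical helper: load a runtime config served as a static asset.
// The "/json/config.json" path assumes the file sits in public/json/.
async function loadConfig(url = "/json/config.json") {
  const resp = await fetch(url);
  if (!resp.ok) {
    throw new Error(`Failed to load config: HTTP ${resp.status}`);
  }
  return resp.json();
}
```

In a React component you would typically call this from a useEffect hook and keep the result in state.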
I have a folder structure like this
index.html
file.txt
In index.html I have a script that wants to read file.txt and assign it to a variable, say x.
async function loadFile(file) {
let text = await file.text();
console.log(text);
return text
}
var x = loadFile("file.txt")
But this returns an error:
main.js:35
Uncaught (in promise) TypeError: file.text is not a function
at loadFile (main.js:35:27)
at main.js:39:13
what to do? I just want to assign file.txt content to a variable.
JavaScript in the browser doesn't give you direct access to the file system.
What you did is pass a string to your function and call text() on it. Sadly, it doesn't work like that.
Here are two workarounds I know of:
1. fetch api
If you have an HTTP server serving your files, then you can make a request using fetch() to obtain their content in JS. Read the docs here:
https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API/Using_Fetch
Note that it won't work on pages loaded with the file:// URI scheme; you need http(s). If you have Python installed, run python -m http.server 8000 in the directory containing the files to be served.
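As a sketch (assuming file.txt sits next to index.html and the page is served over HTTP):

```javascript
// Sketch: fetch file.txt from the same directory the page is served from
// and resolve to its text content. Only works over http(s), not file://.
async function loadTextFile(path) {
  const resp = await fetch(path);
  if (!resp.ok) {
    throw new Error(`HTTP ${resp.status} while fetching ${path}`);
  }
  return resp.text();
}
```

Note that loadTextFile returns a Promise, so the caller has to await it (or chain .then) to get the string, rather than assigning the return value directly to x.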
2. FileReader
If the file is to be selected by the user as input to a form, then you can use FileReader.
Have a look at these links:
web.dev has a great explanation for this:
https://web.dev/read-files/
also see the docs
https://developer.mozilla.org/en-US/docs/Web/API/FileReader
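For completeness, a minimal sketch: once the user picks a file through an input element, you get a real File object, and File objects do have a .text() method. The original error happened because a plain string was passed in, not a File:

```javascript
// Sketch: read a user-selected File. `file` must be a File/Blob taken
// from an <input type="file"> change event (e.g. input.files[0]),
// not a path string.
async function readChosenFile(file) {
  return file.text(); // resolves to the file's contents as a string
}

// Hypothetical wiring, assuming <input type="file" id="picker"> exists:
// document.getElementById("picker").addEventListener("change", (e) => {
//   readChosenFile(e.target.files[0]).then((text) => console.log(text));
// });
```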
Sorry for the wording in the question. Probably my biggest issue with this is not knowing how to phrase it correctly, as I've not been able to glean a single hint of an answer from Google.
Using API routes in Next.js, I want to serve a data.json file. This works no problem. However, I also want to be able to edit this file afterwards and have the API reflect the updated content. As it stands, after building and running, if I edit the file the API still returns the old one; hell, I can even delete it. I assume this is because Next.js makes a copy of the file at build time and puts it somewhere in the .next directory(?), though I haven't been able to find it there.
Below is a boiled down version of what I'm doing:
/pages
/api
/info.js
/data
/data.json <- file I want to use with the api
pages/api/info.js
export default function (req, res) {
const data = require('../../data/data.json')
// Do stuff
res.status(200).json(data.something)
}
Any guidance on this is very appreciated
Using require to include a file in any Node app ties the JSON file to the app's build time (or first run): require reads the file once and caches the result.
The feature you describe sounds like static file serving, but Next caches those files as well.
Try reading the file in the API handler instead:
const fsp = require('fs').promises
const path = require('path')

export default async function (req, res) {
  try {
    // Relative paths resolve against the process cwd, not this file,
    // so build the path from the project root instead
    const file_path = path.join(process.cwd(), 'data', 'data.json')
    const file_data = await fsp.readFile(file_path)
    const json_data = JSON.parse(file_data)
    // Do stuff
    res.status(200).json(json_data.something)
  }
  catch (error) {
    console.log(error)
    res.status(500).json({ error: 'Error reading data' })
  }
}
Actual static files
In development you can probably work around this by triggering rebuilds when static files update.
If you want to update the data files independently of a built app, they will probably need to be hosted separately from the Next build. If you are using next export for a completely static site, you might already have a place to put the static files. If not, you can serve the data files with another Node process, a web server, or something like S3.
I'm creating an Electron app and in the process am trying to use some existing javascript and other files (css and other resources). My code is spread out across a number of packages, each package containing various of these files. Outside of Electron these files would be served from a server which provides a mapping from a flat list of files to the paths to each of these files and I am trying to implement similar "server-side" functionality in Electron's "back end", if there is such a thing.
Since Electron is getting these files via the file:// protocol, it is not finding most of them: everything resolves relative to the path of the current JavaScript file, and these files do not know about each other and thus cannot specify hard-coded paths.
Is there some mechanism in Electron to hook requests for files so that I can provide the path for it to look at? In my server-side code I do something where I have an object which maps file names to the paths they are in, and I feel like the solution would similarly be to intercept requests and tell Electron where to look for each file.
I found this question but the solution offered there won't work for me because my web app is a bit too dynamic and the requests are coming from deep in the code, not some user interface element I could trap.
You can accomplish this by intercepting the file protocol handler. If you have set up your file mappings into an object like so:
const files = {
  "file1.js": "/path/to/file1.js",
  "file2.js": "/path/to/file2.js",
  // etc.
}
Then in the createWindow function you will insert this code immediately after you instantiate the new BrowserWindow:
protocol.interceptFileProtocol("file", (req, cb) => {
  const parts = req.url.split("/")
  const file = parts[parts.length - 1]
  if (files[file]) {
    console.log(`intercepting ${file} => ${files[file]}`)
    cb({ path: files[file] })
  }
})
Note: The protocol references a const that you get from requiring electron, e.g. something like this:
const {app, BrowserWindow, protocol} = require("electron")
This code assumes that the file names are unique and are the only part of the path that matters. So, for instance, no matter what path the code thinks "file1.js" is in, in the above example it will be redirected to /path/to/file1.js. If the requested file doesn't exist, the behavior is undefined and probably nothing will load.
I have been given 100+ JSON files which I need to display locally in a react app. I'm able to load one file at a time using the fetch() function, but I'm not sure how to load all of the files.
I've considered getting a list of all of the files and then doing a fetch() on the list, but the issue is that I cannot access the list of files in the directory.
I read that I could use fs, but it seems like that won't work in the browser. For example, I've tried:
var fs = require('fs');
var files = fs.readdirSync('../app/components/data/');
but this throws the error: fs.readdirSync is not a function. I'm open to different approaches.
If the files are small, one option would be to merge them all into one large JSON array in one file and fetch() that. If you don't mind load times taking a bit of a hit, you could even import or require() the JSON file from your application code, including its contents in your JS bundle.
However, if the files are big, you're probably better off creating a 'manifest' file which describes the contents and locations of the other files. It wouldn't be too hard to write a script to store all the files in that directory in an array in an index.json. From there, you could fetch() the index from the browser, and then fetch() each file individually.
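A sketch of that manifest approach (the /data/ paths and the index.json shape, a plain array of file names, are assumptions):

```javascript
// Sketch: fetch a generated index.json listing the data files, then
// fetch every listed file in parallel. Paths are assumed, not prescribed.
async function loadAllData(indexUrl = "/data/index.json") {
  const index = await (await fetch(indexUrl)).json(); // e.g. ["a.json", "b.json"]
  return Promise.all(
    index.map((name) => fetch(`/data/${name}`).then((r) => r.json()))
  );
}
```

Promise.all keeps the results in the same order as the manifest, so each entry can be matched back to its file name.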
Is it possible to add a file into the website's directory?
In other words, let's say I have a directory (as an example I'll call it "myWeb") that has index.html and main.js.
In main.js, I want to add a new page to myWeb when I click a button, let's say secondindex.html (for now, I won't worry about overwriting the file when clicked again).
Is this possible? If not, are there other techniques, or is this too broad?
For security reasons, the client side cannot directly write to the server side, as the two are not connected to one another. If writing a file to the server is what you want, you'll need a backend server and some kind of API/script that you interact with to create files that way.
There are other ways that don't involve creating files, such as using a database or a cloud solution like Firebase. However, there are perfectly valid reasons to keep your site basic and write static files (speed being one of them), and not wanting to set up and maintain a database is another.
This can be done using the fs module in Node.js for example:
var fs = require('fs');

function generateHtml() {
  return '<!DOCTYPE html><html><head><title>My Page</title></head><body><p>A test page</p></body></html>';
}

var filename = '/path/to/secondindex.html';
var stream = fs.createWriteStream(filename);

stream.once('open', function (fd) {
  var html = generateHtml();
  stream.write(html);
  stream.end();
});
The above would live in a JavaScript file server-side, and then you would have some kind of server listening for a request (preferably protected using a token or something) and creating the needed file.
The fs module in particular is flexible in that you can create files many different ways. The above uses streams and is great for when you're dealing with potentially massive files.