Transfer big data to php server-script - javascript

Currently I'm creating a little IDE in JavaScript on my webpage, and I'm struggling with the concept of the saving process. To begin with, I want the whole code written in my IDE to be saved to a .txt file.
So far I know how to pass values from client-side JS to server-side PHP with an HTTP request, but in my case I basically have to transfer a very long string. From within the PHP script I want to create/manipulate a .txt file to store the data outside the web server's document root.
So I was wondering if there is a good solution for passing a lot of data at once to the PHP script running on the server.

Do you mean something like this?
https://www.w3schools.com/php/php_file_create.asp
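For what it's worth, a plain POST has no practical problem with long strings, so the whole editor contents can be sent in one request body. A minimal client-side sketch follows; the save.php endpoint name and the text/plain content type are assumptions to adapt to your setup (on the PHP side, reading php://input and writing it with file_put_contents to a path above the document root would complete the picture):

```javascript
// Build the request options separately so the pure part is easy to test.
// 'save.php' is a hypothetical endpoint name.
function saveRequestInit(code) {
  return {
    method: 'POST',
    headers: { 'Content-Type': 'text/plain; charset=utf-8' },
    body: code, // the whole editor contents, however long
  };
}

// Send the code to the server; resolves to true when PHP reported success.
async function saveCode(code) {
  const res = await fetch('save.php', saveRequestInit(code));
  return res.ok;
}
```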

Related

Is it possible to have a database under MySQL export rows into managed folders accessible to the web?

Is it possible to have a MySQL database export rows into managed folders accessible on the web, with the contents of each column being its own text file OR JSON file?
To be honest, I can't find anything anywhere on how to do this.
My question seems to be quite niche.
I'll break down my problem.
Let's say I have a database with a table called "registry".
Under registry, I would have 4 columns.
For example:
username  nickname  dob       timestamp
bob2414   bobby     03211989  201019
sarah83   sars      10162002  231019
masterc   carlo     07271997  261019
blahhbh   umomi     03241999  281119
Is there a way, upon a new entry, to trigger either a JS bot or any kind of automation to create a structure on the web that could be called upon? For example:
domain.com/registry
    bob2414/
        nickname.txt
        dob.txt
        timestamp.txt
    (or) bob2414.json
    sarah83/
        nickname.txt
        dob.txt
        timestamp.txt
    (or) sarah83.json
    masterc/
        nickname.txt
        dob.txt
        timestamp.txt
    (or) masterc.json
and so on?
Is there a page I can view, or any video explanation online?
Are there any better ways I can do this?
The crux of my problem lies in creating these directories: I don't want one huge JSON file hanging around, because I plan on these files being downloaded and accessed, and I want smoother download times than managing one huge database dump would give.
The same would also apply after I delete a row: would the corresponding files be deleted on their own?
Please, and thank you. I'll help any way I can!
Answering my own question with a personal thought experiment.
In theory it would be easier to build a PHP endpoint that queries the database on a GET request and presents the data in JSON format than to worry about managing files and directories with a Java or JavaScript applet.

Is it possible to overwrite a CSV file on a javascript event fired on the front end?

I would like to know the feasibility of doing the following:
Fire a JavaScript event on the front end that changes a variable within a CSV (possibly by simply overwriting that file), then
Run a python script on the Django backend which reads that CSV and outputs to another CSV (my understanding is that AJAX works for this), then
Read the output with JavaScript and use it on the front end.
Is this a possible event loop, particularly given the asynchronous issues involved? So far I've been able to accomplish step 3 using jquery-csv, and I believe it's possible to run a Python script with an AJAX call, making step 2 seem pretty doable. But is it possible to look into an input CSV, grab a particular variable I want to change, and then change it?
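The "grab a particular variable and change it" part is straightforward once the CSV text is in memory. A naive sketch (no support for quoted fields, and the function name is mine; the actual overwrite of the file would still have to be a request to the Django backend, since the browser cannot write server files directly):

```javascript
// Replace one cell in CSV text, addressed by row index and column name.
// Assumes the first line is a header row and fields contain no commas.
function updateCsvField(csvText, rowIndex, column, value) {
  const rows = csvText.trim().split('\n').map(r => r.split(','));
  const col = rows[0].indexOf(column); // header row names the columns
  if (col === -1) throw new Error('unknown column: ' + column);
  rows[rowIndex + 1][col] = value;     // +1 skips the header row
  return rows.map(r => r.join(',')).join('\n');
}
```

The modified text would then be POSTed to the backend, which owns the file and can hand it to the Python step.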

Using a JSON file to load events into FullCalendar. Will this work for multiple users?

Here is how I am getting event information from my database into FullCalendar, using PHP:
Query database for event information.
Place that information into an array and make my formatting edits, add colors, whatever.
Use json_encode to put array into JSON format.
Write the file to my server as "results.json".
Then in my JavaScript I use this file to fill my Calendar object:
$('#calendar').fullCalendar({
    events: 'results.json'
});
So that all works great.
Here is my concern:
What happens when I have multiple users?
Jim is going to query the database for his events.
Those events are going to be written to results.json.
At the same time, Sue may open the page and query the database for her events.
The code is going to overwrite results.json.
Who knows what events are going to show up on their calendar!
I see some suggestions about using socket.io, but there are other articles suggesting that people should be moving to use WebSockets. And I understand that these are supposed to help with real-time applications.
But I'm not following a chat session that is being updated real-time. I have many users that are accessing their own data. Does that mean that every user needs to have their own JSON file? That seems... yucky. Should this file be saved to their local device? That seems full of permission issues.
Any advice?
Many thanks to Roljhon and Archer, who pointed out that I'm asking the wrong question here.
The short answer to the question above is:
NO, you should not be saving a file to the server with data that is going to change and be different for each user. There is a better way.
The REAL question is, once I have my data in a PHP variable, how do I get it to JavaScript?
There is a very good explanation here: How to pass variables and data from PHP to JavaScript?
This explains how to use AJAX, use the DOM in JavaScript, or simply echo your PHP variable into a JavaScript variable. (Did NOT know you could do that.)
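To sketch the AJAX route: have PHP return only the current user's rows as JSON (keyed off the session, never a shared file on disk), and map them into FullCalendar's event shape on the client. The column names and the events.php endpoint here are assumptions:

```javascript
// Map database rows, as returned by a hypothetical per-user events.php,
// into the {title, start, color} objects FullCalendar expects.
function toCalendarEvents(rows) {
  return rows.map(r => ({
    title: r.name,
    start: r.start_date,
    color: r.color,
  }));
}
// Usage sketch: point FullCalendar at the endpoint instead of a file, e.g.
// $('#calendar').fullCalendar({ events: 'events.php' });
// and do any reshaping server-side, or fetch + toCalendarEvents yourself.
```

Because each browser asks the server for its own data, Jim and Sue can hit the page at the same time without ever overwriting each other.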

Converting an XML file into an object

I'm trying to use the Planetside 2 database to create a simple application for educational purposes. However, I'm stuck. I need to convert an XML file to a JavaScript object so I can use the data easily and display it in my app. The problem is that the file is behind a link, I suppose. I tried it just like you would with a file located on the same server/PC; however, that doesn't work.
This is the link:
http://census.daybreakgames.com/xml/get/ps2:v2/character/?name.first_lower=litebrite
My question is: how do I convert/turn this into an object usable in JavaScript? (I have absolutely no experience in JavaScript, hence the reason I'm not using the JSON version of the file.)
Thanks
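In the browser, the built-in DOMParser will parse fetched XML text, and a small recursive walk turns the parsed element tree into a plain object. A rough sketch (it ignores attributes and repeated tag names overwrite each other, so it is only a starting point; note the census API would also need to allow cross-origin requests):

```javascript
// Browser usage sketch:
//   const xml = await (await fetch(url)).text();
//   const root = new DOMParser().parseFromString(xml, 'application/xml').documentElement;
//   const obj = xmlToObject(root);
// Works on anything element-shaped: {tagName, children, textContent}.
function xmlToObject(el) {
  if (el.children.length === 0) return el.textContent; // leaf: just the text
  const obj = {};
  for (const child of el.children) {
    obj[child.tagName] = xmlToObject(child); // repeated tags clobber each other
  }
  return obj;
}
```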

How to attach large amounts of data with tampermonkey script?

My script adds some annotations to each page on a site, and it needs a few MBs of static JSON data to know what kind of annotations to put where.
Right now I'm including it with just var data = { ... } as part of the script, but that's really awkward to maintain and edit.
Are there any better ways to do it?
I can only think of two choices:
Keep it embedded in your script, but to keep it maintainable (a few megabytes means your editor might not like it much), put it in another file and add a compilation step to your workflow to concatenate the two. Since you are adding a compilation step anyway, you can also uglify your script, so it might be slightly faster to download the first time.
Get it dynamically using JSONP. Put it on your web server, Amazon S3, or even better, a CDN. Make sure it is server-cacheable and gzipped so it won't slow down the client's network by being downloaded on every page! This solution works better if you want to update your data regularly, but not your script (I think Tampermonkey doesn't support auto-updates).
My bet would definitely be to use the special storage functions provided by Tampermonkey: GM_getValue, GM_setValue, GM_deleteValue. You can store your objects there as long as needed.
Just download the data from your server once, on the first run. If it's just for your own use, you can even simply insert all the data directly into a variable from the console, or use a temporary textarea, and have the script save that value with GM_setValue.
This way you can even optimize the speed of your script by keeping unrelated objects in different GM variables.
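Putting the GM_* suggestion together, a sketch of the fetch-once-then-cache pattern. The storage argument stands in for the userscript storage API so the snippet is self-contained; in Tampermonkey you would pass something like { get: GM_getValue, set: GM_setValue }, and fetchText would be (await fetch(yourDataUrl)).text():

```javascript
// Fetch the big JSON once, cache the raw text in userscript storage,
// and reuse the cached copy on every later page load.
async function loadAnnotationData(storage, fetchText) {
  const cached = storage.get('annotationData');
  if (cached !== undefined) return JSON.parse(cached); // cache hit: no network
  const text = await fetchText();                      // first run only
  storage.set('annotationData', text);
  return JSON.parse(text);
}
```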
