Possible ways to send a huge amount of data to a PHP server - javascript

I have a multi-step form in a project that handles a lot of data. To prevent errors during creation, all information is stored client-side and sent to the server at the end.
The information sent to the server looks like this:
{
    name: "project1",
    description: "lot of text",
    schedule: [{weekDay: 1, startHour: "09:00", endHour: "15:00"}, ...],
    tasks: ["task1", "task2", ... up to 20/30],
    files: [{file1}, {file2}, ...],
    services: [{
        name: "service1",
        description: "lot of text",
        schedule: [{weekDay: 1, startHour: "09:00", endHour: "15:00"}, ...],
        tasks: ["task1", "task2", ... up to 20/30],
        files: [{file1}, {file2}, ...],
        jobs: [{
            name: "job1",
            description: "lot of text",
            schedule: [{weekDay: 1, startHour: "09:00", endHour: "15:00"}, ...],
            tasks: ["task1", "task2", ... up to 20/30],
            files: [{file1}, {file2}, ...]
        }, {
            name: "job2",
            ...
        }]
    }, {
        name: "service2",
        ...
    }]
}
And so on.
This is a really reduced example; in a real environment there will be 1 project with about 10-15 services, each one with 4-5 jobs.
I have been able to process everything with about 15 items in the last level, and now I'm trying to preprocess the data to delete objects not needed on the server before sending. With that I expect to be able to send over 50 items in the last level without triggering the "max_input_variables exceeded xxx" error server-side (PHP's max_input_vars limit). But even then, some cases will still be very close to the limit.
I'm thinking about changing the way I send/receive data, but I'm not sure if my guesses are even correct.
Before anyone suggests a JSON request to avoid the input-variables error: the request has to be multipart/form-data in order to send files.
That said, my guesses were the following:
Mount all the data as JSON in a single variable and keep the files in separate variables (the formData would look like {project: {hugeJSON}, files: [file1, file2], services: [{files: [...]}, {files: [...]}]}).
Send partial data to the server while the form is being filled and store it somewhere (a tmp file would be my best bet), then in the last step send only the main form information.
Probably a stupid guess, but is there something like sending chunked data? Ideally, I would like to show the user a loading bar saying "Creating project --> Saving Service nº1 --> Generating Docs for Service 1...". I think I could achieve this by making my server-side script generate a chunked response, but I'm not sure about that.
Well, any help that points me in the right direction would be really appreciated.
Thank you in advance.

Once you have finished filling your object, you should stringify it and send it to the server as a single POST parameter.
Once you receive it server-side, you can parse the JSON and continue working.
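A minimal sketch of that approach, assuming a single project field for the JSON and a files[] field for the uploads (the endpoint name create_project.php is made up); on the PHP side, json_decode($_POST['project'], true) gives the structure back, and only a handful of input variables count against max_input_vars:

// Sketch only: field names, endpoint and error handling are assumptions.
// "files" is an array of File objects collected from the form steps.
function sendProject(project, files) {
    var formData = new FormData();

    // The whole nested structure travels as ONE input variable.
    formData.append('project', JSON.stringify(project));

    // Files stay as real multipart parts, so PHP fills $_FILES as usual.
    files.forEach(function (file, i) {
        formData.append('files[' + i + ']', file);
    });

    // Do not set Content-Type manually; the browser adds the multipart
    // boundary itself when the body is a FormData object.
    return fetch('create_project.php', { method: 'POST', body: formData });
}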

Related

How to retrieve data from database and pass it to variables in JavaScript?

I'm working on a trivia game and need to update the JavaScript code to retrieve data from a database and pass the data to variables, but I don't know where to begin with making the change. My best guess would be to implement PHP code to interact with the database and retrieve the data.
The current iteration of my code uses an internal array to create the variables and values for the question elements:
var questions = [{
    question: "Which list contains words that are NOT names of shoe types?",
    choices: ["A. Oxford, jelly, boat, clogs, stiletto, mary jane",
              "B. Loafer, gladiator, wedge, mule, platform",
              "C. Pump, moccasin, wingtip, sneaker, derby, monk",
              "D. Chalupa, dogler, hamster, croonley, frankfurt"
    ],
    correctAnswer: 3
}
// the array contains ten of these objects
];
The database the JavaScript code will retrieve from is populated like this:
Insert into TriviaQuestions(id,questionNum,question,option1,option2,
option3,option4,option5,option6,questionAnswer)
Values('1','1','Which list contains words that are NOT names of shoe types:',
' A. Oxford, jelly, boat, clogs, stiletto, mary jane',
' B. Loafer, gladiator, wedge, mule, platform',
' C. Pump, moccasin, wingtip, sneaker, derby, monk',
' D. Chalupa, dogler, hamster, croonley, frankfurt',
'',
'',
'3')
;
//there are ten questions like these
Use any of the major libraries (e.g. jQuery) to make an AJAX request to a server-side script (written in PHP, Python, etc.).
So for example:
function send_request(question) {
    // $.post(url, data, successCallback, dataType)
    $.post("ajax.php", { send_question: question }, function (result) {
        question = result.new_question;
    }, "json");
}
The solution you are looking for is AJAX.
JavaScript is a client-side technology. That means that whatever it can do, your client can do as well. If you were to give it access directly to your database, your visitors would have direct access to your database, and that is not what you want to happen. You may also want to keep in mind that your visitors can see all of the code that you send in your JavaScript, and can edit it arbitrarily.
Instead, you want to write most of your code, including the database connections, and even the answers to the questions, to be handled on the server, probably in PHP.
AJAX simply means having your JavaScript send a POST request to your PHP to get an update from it so that you can update the live page on the browser.
You should have a PHP script which takes in requests and gives out little bits of data to talk to your JavaScript.
If you want to go a step further, you can set a session variable in the PHP that can hold all of the data for the session. The JavaScript can ask the PHP script to get a new set of questions, and then can ask it to check the answer, etc.
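As a rough sketch of that flow (the endpoint name check_answer.php and the response field correct are assumptions, not something prescribed by this answer):

function submitAnswer(questionId, choiceIndex) {
    // POST the player's choice; the PHP side compares it against the answer
    // kept in the database/session, so the correct answer never reaches the browser.
    $.post('check_answer.php', { id: questionId, answer: choiceIndex }, function (result) {
        if (result.correct) {
            // update the score, load the next question, etc.
        }
    }, 'json');
}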
Here's an article that should help you get started:
https://www.w3schools.com/php/php_ajax_database.asp
This article seems like a good, quick introduction into PHP sessions:
https://www.tutorialrepublic.com/php-tutorial/php-sessions.php

Fetch list of 50,000 most subscribed channels

I'm trying to figure out a way to grab the top 50,000 most subscribed youtube channels using javascript. These only need to be grabbed once and will be stored in a file to be used for an autocomplete input in a webpage.
I've gotten pretty close to getting the top 50 by using search.list (/youtube/v3/search) with the parameters maxResults=50, order=viewCount, part=snippet, type=channel, fields=nextPageToken,items(snippet(channelId,title))
Returning:
{
    "nextPageToken": "CDIQAA",
    "items": [{
        "snippet": {
            "channelId": "UC-9-kyTW8ZkZNDHQJ6FgpwQ",
            "title": "Music"
        }
    }, {
        "snippet": {
            "channelId": "UC-lHJZR3Gqxm24_Vd_AJ5Yw",
            "title": "PewDiePie"
        }
    }, {
        "snippet": {
            "channelId": "UCVPYbobPRzz0SjinWekjUBw",
            "title": "Анатолий Шарий"
        }
    }, {
        "snippet": {
            "channelId": "UCam8T03EOFBsNdR0thrFHdQ",
            "title": "VEGETTA777"
        }
    }, ...
Then all I'd have to do is fetch that 1000 more times using the nextPageToken to get a list of the top 50,000.
Unfortunately, sorting by relevance, rating, viewCount, or nothing is not yielding the 50 most subscribed channels, and there doesn't seem to be any way to order them by subscriber count according to the documentation, so it seems like I am stuck.
Just before you write your 50 results to a file (or database), you can make one more API call: take the channelId field from each of your results, merge them comma-delimited, and call Channels: list.
For example, you can use the following parameters (these are the IDs from your example above):
part=statistics
id=UC-9-kyTW8ZkZNDHQJ6FgpwQ,UC-lHJZR3Gqxm24_Vd_AJ5Yw,UCVPYbobPRzz0SjinWekjUBw,UCam8T03EOFBsNdR0thrFHdQ
And the result will look something like this:
{
    "kind": "youtube#channel",
    "etag": "\"m2yskBQFythfE4irbTIeOgYYfBU/MG6zgnd09mqb3nAdyRnPDgFwfkE\"",
    "id": "UC-lHJZR3Gqxm24_Vd_AJ5Yw",
    "statistics": {
        "viewCount": "15194203723",
        "commentCount": "289181",
        "subscriberCount": "54913094",
        "hiddenSubscriberCount": false,
        "videoCount": "3175"
    }
}
You can then take subscriberCount from the result for each channel.
I know this is not a way to sort your 50 results while writing them to the file, but it lets you sort them by subscriber count later, when fetching from the file for your autocomplete input.
I didn't find any other way to sort results by subscriber count, so maybe this can be helpful.
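As a rough sketch of that extra call (the function name is made up, and API key handling, quota and error checks are left out), the statistics lookup for one page of up to 50 channel ids might look like this:

function fetchSubscriberCounts(channelIds, apiKey) {
    var url = 'https://www.googleapis.com/youtube/v3/channels' +
              '?part=statistics' +
              '&id=' + channelIds.join(',') +
              '&key=' + apiKey;

    return fetch(url)
        .then(function (res) { return res.json(); })
        .then(function (data) {
            // Map each channel to the number you will sort on later.
            return data.items.map(function (item) {
                return {
                    channelId: item.id,
                    subscriberCount: Number(item.statistics.subscriberCount)
                };
            });
        });
}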
The idea is to run a server-side script that makes the RESTful API calls in a loop and writes the results to a .json file. For that you can create a PHP script that makes the REST API call to Google, fetches the first 50 results, and then uses file write operations to save them. Run that PHP script as a cron job to update the results at regular intervals; running the cron job at whatever interval you set keeps the results fresh.
Run the cURL call in a loop over nextPageToken, fetching 50 results each time, and write everything to a temporary .json file. Once all results are fetched, replace your old JSON file with the newly created temporary file. This regenerates a fresh JSON file at regular intervals, picking up any changes to the data.
The point of the temporary file is to keep the AJAX reads from being slowed down or blocked by constant read and write operations on the same file. Once the temporary file is written, simply use a move command to replace the actual file.
Make sure you send cache-control headers with the AJAX results to keep the data fresh.
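The answer above describes this as a PHP script run from cron. Purely to illustrate the same loop-plus-atomic-replace idea, here is a rough Node.js sketch (assuming Node 18+ for the global fetch; quota limits and error handling are omitted, and the file names are made up):

const fs = require('fs');

async function refreshChannelList(apiKey) {
    let items = [];
    let pageToken = '';

    // 1000 pages x 50 results = the 50,000 channels mentioned in the question.
    for (let page = 0; page < 1000; page++) {
        const url = 'https://www.googleapis.com/youtube/v3/search' +
            '?part=snippet&type=channel&order=viewCount&maxResults=50' +
            '&fields=nextPageToken,items(snippet(channelId,title))' +
            (pageToken ? '&pageToken=' + pageToken : '') +
            '&key=' + apiKey;

        const data = await (await fetch(url)).json();
        items = items.concat(data.items || []);
        if (!data.nextPageToken) break;
        pageToken = data.nextPageToken;
    }

    // Write to a temp file first, then rename it over the real one, so readers
    // never see a half-written JSON document.
    fs.writeFileSync('channels.json.tmp', JSON.stringify(items));
    fs.renameSync('channels.json.tmp', 'channels.json');
}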

Angular.js accessing and displaying nested models efficiently

I'm building a site at the moment where there are many relational links between data. As an example, users can make bookings, which will have booker and bookee, along with an array of messages which can be attached to a booking.
An example json would be...
booking = {
    id: 1,
    location: 'POST CDE',
    desc: "Awesome stackoverflow description.",
    booker: {
        id: 1, fname: 'Lawrence', lname: 'Jones'
    },
    bookee: {
        id: 2, fname: 'Stack', lname: 'Overflow'
    },
    messages: [
        { id: 1, mssg: 'For illustration only' }
    ]
}
Now my question is, how would you model this data in your angular app? And, while very much related, how would you pull it from the server?
As I can see it I have a few options.
Pull everything from the server at once
Here I would rely on the server to serialize the nested data and just use the given json object. Downsides are that I don't know what users will be involved when requesting a booking or similar object, so I can't cache them and I'll therefore be pulling a large chunk of data every time I request.
Pull the booking with booker/bookee as user ids
For this I would use promises for my data models, and have the server return an object such as...
booking = {
    id: 1,
    location: 'POST CDE',
    desc: "Awesome stackoverflow description.",
    booker: 1, bookee: 2,
    messages: [1]
}
Which I would then pass to a Booking constructor, which would resolve the relevant (booker,bookee and message) ids into data objects via their respective factories.
The disadvantages here are that many ajax requests are used for a single booking request, though it gives me the ability to cache user/message information.
In summary, is it better practice to rely on a single AJAX request to collect all the nested information at once, or on various requests to 'flesh out' the initial response after the fact?
I'm using Rails 4 if that helps (maybe Rails would be more suited to a single request?).
I'm going to use a system where I can hopefully have the best of both worlds, by creating a base class for all my resources that is given a custom resolve function, which knows what fields in that particular class may require resolving. A sample resource would look like this...
class Booking
  # other methods...
  resolve: ->
    booking = this
    User
      .query(booking.booker, booking.bookee)
      .then (users) ->
        [booking.booker, booking.bookee] = users
Where it will pass the value of the booker and bookee fields to the User factory, which will have a constructor like so...
class User
  # other methods
  constructor: (data) ->
    user = this
    if not isNaN(id = parseInt data, 10)
      User.get(data).then (data) ->
        angular.extend user, data
    else angular.extend this, data
If the value passed to the User constructor can be parsed into a number (so this will happily take string ids as well as numerical ones), it will use the User factory's get function to retrieve the data from the server (or through a caching system; the implementation is obviously inside the get function itself). If however the value is not a parseable id, then I'll assume that the User has already been serialized and just extend this with the value.
So the caching is invisible and independent of how the server returns the nested objects. It allows for modular AJAX requests and avoids having to redownload unnecessary data via its caching system.
Once everything is up and running I'll write some tests to see whether the application would be better served with larger, chunked ajax requests or smaller modular ones like above. Either way this lets you pass all model data through your angular factories, so you can rely on every record having inherited any prototype methods you may want to use.
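For illustration only, a usage sketch in plain JavaScript (the $http wiring and a Booking constructor that accepts the raw response are assumptions on top of the CoffeeScript above):

$http.get('/bookings/1').then(function (response) {
    var booking = new Booking(response.data);  // booker: 1, bookee: 2 at this point
    booking.resolve();                         // afterwards booker/bookee are full User objects
});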

Sequential array issue

Back end code - PHP
Front end - Angular/JavaScript
I am experimenting with a preferential search on my website. I have users who are mapped to friends, and each user can post content which can be "liked". My idea for the search was to count how many of a user's friends have "liked" resources on the site and sort them from highest to lowest. I have the main chunk of this working (the background code) and have it returning an object that looks like:
{"results":
"post":
{"9": {"message" : "blah9"}
,
"1": {"message" : "blah"}}
}
The number is the id of the post (just a side note: I'm using it to refresh something elsewhere on the site). My problem is that when I console.log() this to the screen it changes to:
{"results":
"post":
{"1": {"message" : "blah"},
"9": {"message" : "blah9"}}
}
Which makes the sorting code kind of useless. Is there any way I can stop this from happening?
$http.post('php/router.php', {'request': 'search', 'page': 'Search', 'searchString': searchString}).success(function(data) {
    console.log(data.results.post);
});
Let the JavaScript side of things do the sorting and remove the sort from your PHP entirely. Just have the PHP do the pagination of the set (1 to 10, 11 to 20, etc.) and the JavaScript can order each page of results for you (chunks of 10, from my earlier example).
You'll probably still have some kind of sort on the PHP side if you have a ton of results to chunk up, but the JS can certainly sort each chunk that is sent to the client.
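A sketch of the client-side sort, under one assumption that goes beyond the answer itself: the PHP returns an array of posts, each carrying its id and a like count, instead of an object keyed by id (plain JS objects reorder integer-like keys, which is what breaks the current approach):

$http.post('php/router.php', { request: 'search', page: 'Search', searchString: searchString })
    .success(function (data) {
        // e.g. [{id: 9, message: "blah9", likes: 12}, {id: 1, message: "blah", likes: 3}]
        var posts = data.results.post.slice();

        // Highest friend-like count first.
        posts.sort(function (a, b) { return b.likes - a.likes; });

        console.log(posts);
    });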

process csv/json data in java servlets and javascript

I need an opinion on how to approach my problem. I have no idea how to start or which functions to implement in which parts of the software. So this is what I want to do:
I have a Java servlet which creates a simple csv file:
name1, value1
name2, value2
etc.
This needs to be somehow converted to JSON data so it can be displayed on a JSP page:
[
    {
        "name": "name1",
        "value": "value1"
    },
    {
        "name": "name2",
        "value": "value2"
    }
]
Then the user will be redirected to the JSP page. Is it possible to send the whole JSON structure to the JSP page via the request object? Or is it easiest if all the processing is done in JavaScript and only the path to the CSV file is sent via the request object?
I'm kind of lost on this, since I only started programming web applications last week. I'd just need a push in the right direction and then I should be able to figure out the rest on my own ;)
First, look for a CSV parser which can turn a CSV file into a List<Bean> or a List<Map<K, V>>.
Then, look for a JSON parser which can turn a List<Bean> or a List<Map<K, V>> into a JSON string.
Finally, put the two together and set the resulting JSON string as a request attribute, which you print in the JSP as if it's a JS variable, like so: <script>var data = ${data};</script>
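On the page itself, that data variable can then be consumed with plain JavaScript; a small sketch (the element id csv-list and the <ul> markup are made up):

// data was printed by the JSP as: var data = [{"name":"name1","value":"value1"}, ...];
var list = document.getElementById('csv-list');  // e.g. an empty <ul> on the page
data.forEach(function (row) {
    var li = document.createElement('li');
    li.textContent = row.name + ': ' + row.value;
    list.appendChild(li);
});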
