I have a basic bar chart that retrieves data from my database, and this data is displayed in the chart on the front-end.
1) The data is not in an array
2) The data is not in a file
3) JSON.parse will not help in my particular situation
4) The data arrives as a JSON response from an API that requires authentication
My question is, for web developers: in JavaScript, how exactly do I go about analysing a JSON response to determine whether a certain number is over 10, as an example, in this particular situation?
No data is stored in the code itself, but I imagine there is a way to read this data through the console using JavaScript? If so, please point me in the right direction. Thanks!
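If the chart's data comes from an authenticated API call, you can usually reproduce that call yourself (or inspect the response on the Network tab) and then check the values in plain JavaScript. Here is a minimal sketch; the endpoint URL, the Authorization header, and the clicks field are placeholders for whatever your API actually uses:

    // Minimal sketch, assuming a hypothetical endpoint and field name.
    // Replace '/api/chart-data', the Authorization header, and 'clicks'
    // with whatever your API and response shape actually use.
    fetch('/api/chart-data', {
      headers: { Authorization: 'Bearer YOUR_TOKEN' } // placeholder token
    })
      .then(response => response.json()) // parses the JSON body for you
      .then(data => {
        console.log(data); // inspect the full response in the console first

        // Example check: flag every row whose value is over 10
        const over10 = data.filter(row => row.clicks > 10);
        console.log('Rows over 10:', over10);
      })
      .catch(err => console.error('Request failed:', err));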
I am looking for a proper solution to implement the scenario below.
I have a large amount of JSON data to process. Previously, the whole dataset was sent to the server using REST APIs, the server performed certain actions on each JSON row, and after a very long time it sent back a response with the processing status of each row. This approach always leaves the user confused about whether anything is actually being processed (because they are looking at a loading screen for over 10 minutes), and since I am using REST APIs I can't get any status while the data is being processed.
I am looking for answers to the following:
Is there any good way to send the data in small batches?
How can I get the status from the server while it is processing the data?
My frontend is ReactJS and my backend is Node.js.
As per your situation, you can use pagination and send the data in a limited number of packets.
If the user wants more data, they can click on the next or previous page. In this case I would suggest the ANTD library; it works well for pagination. For your reference, here is the link for a trial (a minimal sketch follows the links below).
You can also refer to this codeSandbox by Andrew Guo.
https://codesandbox.io/s/m4lj7l2yq8?file=/src/index.js
https://ant.design/components/pagination/
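As a rough idea of what the ANTD approach looks like, here is a minimal sketch; the rows prop, its id field, and the pageSize default are assumptions about your data, not part of the library:

    import React, { useState } from 'react';
    import { Pagination } from 'antd';

    // Minimal sketch: `rows` is assumed to already be on the client.
    // Only the current page's slice is rendered at a time.
    function PagedRows({ rows, pageSize = 50 }) {
      const [page, setPage] = useState(1);
      const start = (page - 1) * pageSize;
      const visible = rows.slice(start, start + pageSize);

      return (
        <div>
          <ul>
            {visible.map(row => (
              <li key={row.id}>{JSON.stringify(row)}</li>
            ))}
          </ul>
          <Pagination
            current={page}
            pageSize={pageSize}
            total={rows.length}
            onChange={(p) => setPage(p)}
          />
        </div>
      );
    }

    export default PagedRows;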
As you mentioned the data is taking too long, I would suggest indexing in the database. You can also show a Suspense fallback (or a spinner) until the whole dataset is loaded; this helps the user understand that the data is being loaded (a short sketch follows the link below).
Here is the link for your reference
https://ant.design/components/spin/
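A minimal sketch of the loading-indicator side, assuming your component already tracks a loading flag in state:

    import React from 'react';
    import { Spin } from 'antd';

    // Minimal sketch: wrap the content in a Spin while `loading` is true,
    // so the user can see that processing is still in progress.
    function Results({ loading, children }) {
      return (
        <Spin spinning={loading} tip="Processing data...">
          {children}
        </Spin>
      );
    }

    export default Results;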
Picture for context
My Community Connector fetches these 2 fields (Subscription Date and Clicks).
I want to be able to filter by date so my table only shows, for example, data from the last 7 days. This works using the Date Filter that Data Studio provides; however, I notice that this date filter triggers another fetch request with the date I selected.
I don't want this to happen. I want to filter by date USING MY EXISTING DATA. Is there any way to do this? To filter only using my cached data, and not send a new GET request?
While this is not doable from Data Studio side, you can implement your own cache in Apps Script. You can evaluate each getData request and return data from the cache if needed. This will avoid sending new GET requests to your API endpoint.
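A minimal sketch of that idea in Apps Script, using CacheService; the cache key, the fetchFromApi_ helper, and buildResponse_ are placeholders for however your connector currently builds its requests and schema rows:

    // Minimal sketch (Apps Script). `fetchFromApi_`, `buildResponse_` and the
    // cache key are placeholders for your connector's existing logic.
    function getData(request) {
      var cache = CacheService.getScriptCache();
      var cacheKey = 'connector-data'; // could also be derived from `request`

      var cached = cache.get(cacheKey);
      if (cached) {
        // Serve the cached rows instead of hitting the API again
        return buildResponse_(request, JSON.parse(cached));
      }

      var rows = fetchFromApi_(request); // your existing GET request
      // Cache for up to 6 hours (21600 seconds is the CacheService maximum)
      cache.put(cacheKey, JSON.stringify(rows), 21600);
      return buildResponse_(request, rows);
    }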
While this may not be the best option for everyone, I found a quick temporary solution by loading my own community connector USING Google's community connector, Extract Data.
This way my data loads only once, and I can filter it instantly the way I want.
If you want to refresh the data, you 'edit' the data source and save.
So I found this Pie Chart that I would like to use on my website (http://canvasjs.com/docs/charts/chart-types/html5-pie-chart/)
I've already adapted the code so that it establishes a connection to the MySQL database, gets the information I need, saves those values as PHP variables, and displays them within the pie chart. So far so good!
Now I'd really like to make this a little more real-time, as the information changes quite rapidly, so I was thinking of having jQuery update the chart's data on a regular basis, preferably every 1000ms or so. How would I go about achieving this?
Thanks for your suggestions!
Basically you will need to establish an ajax data flow:
your main page will contain only the graph, not the data
request the data by sending an ajax request to a separate page
the data page should return your data object in JSON format (use json_encode())
With PeriodicalUpdater you can update your data at the interval of your choice and automatically adjust that interval to reduce the load on your server.
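A minimal sketch of that flow using plain setInterval and $.getJSON instead of the plugin; data.php and the response shape (json_encode()'d dataPoints in CanvasJS's {label, y} format) are assumptions about your setup:

    // Minimal sketch: poll an assumed PHP endpoint (data.php, returning
    // json_encode()'d dataPoints) every 1000ms and re-render the chart.
    var chart = new CanvasJS.Chart("chartContainer", {
      data: [{ type: "pie", dataPoints: [] }]
    });
    chart.render();

    setInterval(function () {
      $.getJSON("data.php", function (dataPoints) {
        // Replace the chart's data with the fresh values and redraw
        chart.options.data[0].dataPoints = dataPoints;
        chart.render();
      });
    }, 1000);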
There's a worked example in the documentation: http://canvasjs.com/docs/charts/how-to/live-updating-javascript-charts-json-api-ajax/
Basically, you need to use JavaScript and more specifically Ajax to query the server continuously and fetch new data in JSON format. Then update the chart using the brilliantly named 'updateChart' method. :)
I've been trying to get data from my database using PHP and PDO. Before I ask the question, I want to show you the part of the database that I want to get in real time:
The data I want to get is CPU_util, in real time.
** CPU_util moves between 0 and 100
On the website I'm using the Highcharts plugin to display CPU_util. The chart I picked is this one (link to the chart):
My question is:
How do I get the CPU_util data from the database and put it into the chart in real time?
(The X axis should be as it is (the current time) and the Y axis ranges between 0 and 100.)
What I've been trying so far:
I did some coding with Ajax, but after a few tries the website blocked me, because I exceeded the limit on the number of HTTP requests (2000 requests).
Realtime PHP is a completely different animal compared to 'normal' web apps.
As already suggested, websockets or http-long-polling is the way to go.
The big issues to tackle are dealing with the HTTP request limit, not crashing your server, and not starting a php(-fpm) process for each request you make. To achieve this you will have to rethink your architecture a bit.
To achieve realtime php you'd want non-blocking evented php on the server (nodejs style). In the world of php the most used library for achieving this is Ratchet.
If you want to learn more:
http://socketo.me/docs/
http://www.sitepoint.com/how-to-quickly-build-a-chat-app-with-ratchet/
Also, if you're really making A LOT of calls to MySQL, you may want to move this data to a separate high-performance store like Redis.
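On the browser side, the websocket route could look roughly like this; the ws:// URL and the message format (the server pushing a raw CPU_util number) are assumptions, and the Highcharts chart is assumed to already exist as chart with one series:

    // Minimal sketch of the client side, assuming a Ratchet (or other)
    // websocket server at ws://example.com:8080 that pushes CPU_util values,
    // and an existing Highcharts `chart` with one series.
    var socket = new WebSocket('ws://example.com:8080');

    socket.onmessage = function (event) {
      var cpuUtil = parseFloat(event.data); // assumed: server sends the raw value

      // x = current time, y = CPU_util (0-100); shift old points off the left
      chart.series[0].addPoint([Date.now(), cpuUtil], true, true);
    };

    socket.onerror = function (err) {
      console.error('WebSocket error:', err);
    };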
I would like to dynamically create a corresponding JSON file every time a new merchant signs up to my site.
For example:
Burger King signs up to be a merchant on my site. I add Burger King to my merchants.json file. How would I dynamically create a file, referenced from that JSON object, that can later be used to pull up data specific to that merchant on that merchant's page? For example, a JSON file full of products or deals.
Is this even the right way to go about it?
Can someone point me in the right direction please?
This seems like a very common usage scenario but I don't see any examples online that explain this application structure thoroughly.
NOTE: I am using AngularJS
EDIT: Thanks for the tips guys, after asking around in the #AngularJS channel on IRC, I've been told to go the extra mile and create an API. I'll probably use sails.js to help with that. This obviously isn't how I was planning to do things, but as you guys pointed out, it wasn't the best practice; not by a long shot.
1) You'd need a small server-side PHP script that accepts the new JSON file sent by the client
2) Browser requests merchants.json from the server
3) Load it with JSON.parse()
4) Add the merchant to the Object
5) JSON.stringify(object)
6) Send back to the server.
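For completeness, a sketch of what those steps look like on the client; the endpoint paths and the merchant shape are placeholders, and (as noted below) this illustrates the flow rather than recommending it:

    // Sketch of the steps above (not recommended, see below).
    // The endpoint paths and the merchant shape are placeholders.
    fetch('/merchants.json')
      .then(response => response.text())
      .then(text => {
        var merchants = JSON.parse(text);       // step 3
        merchants.push({                        // step 4: add the new merchant
          name: 'Burger King',
          products: 'burger-king-products.json'
        });
        var body = JSON.stringify(merchants);   // step 5

        // step 6: send the whole updated file back to a small PHP script
        return fetch('/save-merchants.php', {
          method: 'POST',
          headers: { 'Content-Type': 'application/json' },
          body: body
        });
      });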
Now, this is a horrible horrible idea. Use a server-side database for storing any kind of information on your clients -- MySQL, whatever. JSON is for transporting data, not storing it. You can write some PHP scripts to dynamically generate a page for your merchant based on the data in the database -- so much easier and so much more secure. The method above will expose the whole client database to the client, and based on your specifications above, I don't see another way.