I am having trouble exporting a certain chart. I have made a JSFiddle (http://jsfiddle.net/oy73rgc4/3/) to show what I am working with. The example doesn't contain all the data points that are used, because with all of them my browser (Chrome) crashes. In total I am using about 80K data points. The Highcharts chart displays normally and doesn't cause any problems. The problem comes when I want to export the chart!
When I export the chart, no matter whether it's PNG/JPG/PDF, it always redirects to https://export.highcharts.com/ with the message 413 Request Entity Too Large. I have tried some googling:
offline-export.js
Other people who have experienced this problem tried to use offline-export.js. I tried this, but it didn't have any effect; it just removed the export button from the chart. https://github.com/highcharts/highcharts/issues/4614
Data grouping
Some suggested using Highcharts data grouping. I checked the API, but I find there is too little explanation about it; I don't think I can implement this from scratch, and I am unable to find an example. http://api.highcharts.com/highstock/plotOptions.series.dataGrouping
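For what it's worth, dataGrouping is a Highstock feature rather than plain Highcharts. If switching to Highstock is not an option, a manual pre-grouping sketch like the following would also shrink the ~80K points before the chart (and the export request) ever sees them; the group size of 10 is just an assumption to tune:

```javascript
// Manual pre-grouping sketch: average every `size` consecutive values so
// far fewer points reach the chart. This is an alternative to Highstock's
// built-in dataGrouping, not the feature itself.
function downsample(points, size) {
  var out = [];
  for (var i = 0; i < points.length; i += size) {
    var slice = points.slice(i, i + size);
    var sum = slice.reduce(function (a, b) { return a + b; }, 0);
    out.push(sum / slice.length);
  }
  return out;
}

// e.g. pass downsample(rawData, 10) to the series instead of rawData
```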
custom exporting server with increased size limit in nginx.conf
I also found that this option could help. I tried to find instructions, but I don't understand how to implement this in my web application (Laravel 5.2). http://www.highcharts.com/docs/export-module/setting-up-the-server
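If you do end up self-hosting the export server behind nginx, the 413 specifically comes from the request body limit. A hypothetical nginx.conf fragment (the 50M value is an assumption; size it to your payloads):

```nginx
# Raise the request body limit so large chart payloads are accepted.
# The default of 1M is what produces "413 Request Entity Too Large".
server {
    client_max_body_size 50M;
}
```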
Does someone have a new suggestion for me on how I could solve this problem? Or could someone help me out with one of the options which I have suggested?
The exporting server is something you deploy on your own server side (i.e. you have to run a server that does the exporting for you). However, if you only need to export PNG and SVG, you can use a client-side-only solution as per their docs.
http://www.highcharts.com/docs/export-module/client-side-export
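Concretely, with the exporting.js and offline-exporting.js modules included, a config sketch that keeps exporting entirely in the browser and never falls back to export.highcharts.com:

```javascript
// Chart options fragment: export client-side only.
exporting: {
  fallbackToExportServer: false
}
```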
Their server seems to have a limit on how big a request it will serve. That means you have to deploy your own export server and configure it (it has to do with the actual HTTP server configuration, I think) to accept larger requests. There is not much you can do on the client, other than limiting the amount of data you display on the chart.
P.S. It always redirects you to the Highcharts export server because the export functionality uses their server by default.
Related
I have a list of webpages example.com/object/140, example.com/object/141, example.com/object/142, ...
and each page should have a particular background image example.com/assets/images/object/140.jpg, example.com/assets/images/object/141.jpg, ...
Some images are missing, and then I use a default image. In that case, when I check whether the image exists, I get a 404 error. I have already seen on several pages that there isn't a direct way to avoid this problem.
So I did the following: I created a service in the backend (C#) that checks whether the file exists with File.Exists(fileName);. That way I avoided the error on my localhost. So far so good.
Now I have published both my frontend and backend as two different services in Azure. The images are in the frontend, but the file service is in the backend. My method no longer works because I can't directly access the frontend folders from the backend. One solution could be to make an HTTP call from the backend to the frontend, but I think this doesn't make much sense; it's getting too messy.
One option could be to store in the DB a boolean with the (non-)existence information, but I think this is prone to inconsistencies (if the boolean is not updated immediately when a new image is uploaded or deleted, for example), even if I run a daily job to clean it up.
Still another option could be to store the images directly in the DB and retrieve them together with the DTOs of the objects I'm loading on each particular page, but I guess images that are shown only in the frontend should be stored in the frontend... shouldn't they?
Therefore:
a) Is any of these ideas acceptable? Is there a better way to avoid this error?
b) Another possibility: is there a way to access the frontend folders from the backend? I get a bit lost with the publishing and artifacts in Azure and I don't know if I could do it somehow.
I'm not sure how you've built the frontend, but I'm assuming the background images are set using CSS. It is possible to set multiple background images in the same rule; the browser will load them all and display them stacked on top of one another. If the first one loads successfully and isn't transparent, it is the only thing the user will see. But if the first image fails to load, for example because it doesn't exist, the second image will be shown.
See this other answer for more details: https://stackoverflow.com/a/22287702/53538
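A minimal sketch of that fallback, built from JavaScript. The paths follow the question's examples; the default.jpg location is an assumption:

```javascript
// Build a CSS background-image value with a fallback: the browser shows
// the object's image if it loads, otherwise the default image layered
// underneath it becomes visible.
function backgroundWithFallback(objectId) {
  return "url('/assets/images/object/" + objectId + ".jpg'), " +
         "url('/assets/images/default.jpg')";
}

// Usage in the page:
// document.body.style.backgroundImage = backgroundWithFallback(140);
```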
I've been trying to get data from my database with PHP and PDO. Before I ask the question, I want to show you the part of the database that I want to get in real time:
the data I want to get is: CPU_util in real time.
** CPU_util ranges between 0 and 100.
On the website I'm using the Highcharts plugin to display CPU_util. The chart that I picked is this (link to the chart):
My question is:
How to get the data in CPU_util from the database and put it in the chart in real time?
(The X axis should stay as it is (the current time), and the Y axis ranges between 0 and 100.)
What I've been trying so far:
I did some coding with Ajax, but after some tries the website blocked me because I exceeded the HTTP request limit (2,000 requests).
Realtime PHP is a completely different animal compared to 'normal' web apps.
As already suggested, websockets or http-long-polling is the way to go.
The big issues to tackle are dealing with the HTTP request limit, not crashing your server, and not starting a PHP(-FPM) thread for each request you make. To achieve this you will have to rethink your architecture a bit.
To achieve realtime PHP you want non-blocking, evented PHP on the server (Node.js style). In the PHP world, the most-used library for achieving this is Ratchet.
If you want to learn more:
http://socketo.me/docs/
http://www.sitepoint.com/how-to-quickly-build-a-chat-app-with-ratchet/
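To make the idea concrete, here is a sketch of the browser side. It assumes a Ratchet server at ws://localhost:8080 pushing JSON messages shaped like {"time": ..., "cpu": ...}; both the URL and the message shape are assumptions:

```javascript
// Turn one message from the (hypothetical) websocket feed into an [x, y]
// point for Highcharts; CPU_util is clamped into the 0-100 range.
function toPoint(message) {
  var data = JSON.parse(message);
  var cpu = Math.max(0, Math.min(100, Number(data.cpu)));
  return [data.time, cpu];
}

// Browser wiring, assuming an existing Highcharts chart in `chart`:
// var socket = new WebSocket('ws://localhost:8080');
// socket.onmessage = function (event) {
//   var point = toPoint(event.data);
//   // shift old points out once the series holds more than 60
//   chart.series[0].addPoint(point, true, chart.series[0].data.length > 60);
// };
```

This replaces polling entirely: the server pushes a point whenever it has one, so there is no HTTP request counter to exhaust.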
Also, if you're really making a lot of calls to MySQL, you may want to move this data to a separate high-performance store like Redis.
I am supporting an app written in ExtJS 2.0.2. The app works fine on multiple servers. However, when I bring the code down locally, data (from the GroupingStore) does not appear in the EditorGridPanel.
I've tried resetting the paths locally but no luck.
I've confirmed the PHP service is getting called and retrieving data.
The grid comes back with rows but no data. Am I missing something silly here?
Thanks,
Chris
I figured out my issue, and you're right, Mohit: the data was not getting to the store. After adding a loadexception listener on the store, I saw there was an "Undefined index:" notice in the PHP file attached to the store. I added an extra check for the request variable the PHP script was looking for:
$filter = array_key_exists('filter', $_REQUEST) ? $_REQUEST['filter'] : null;
Once I placed that in the file, the store had data and the grid was able to load. What's odd is that the servers on Bluehost must suppress these warnings, so the store never sees them there. Thanks for the responses; I'm in good shape now.
I use Ajax $.get to read a file from my local server. However, the page crashes because the file is too large (> 1 GB). How can I solve this problem? Are there other solutions or alternatives?
$.get("./data/TRACKING_LOG/GENERAL_REPORT/" + file, function(data){
console.log(data);
});
A solution, assuming you don't have control over the report generator, would be to download the file in multiple smaller pieces using Range headers: process each piece, extract what's needed from it (I assume you'll be building some HTML components based on the report), and move on to the next piece.
You can tweak the piece size until you find a reasonable value: one that doesn't make the browser crash, but also doesn't result in a large number of HTTP requests.
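A sketch of that chunked download, assuming the server supports Range requests; the 1 MB chunk size is only a starting point to tune:

```javascript
// Download a large file in fixed-size pieces via HTTP Range requests.
var CHUNK = 1024 * 1024; // 1 MB per request (an assumption; tune it)

// Build the Range header value for one piece, e.g. "bytes=0-1048575".
function nextRange(offset, chunk) {
  return 'bytes=' + offset + '-' + (offset + chunk - 1);
}

// Browser usage, following the question's path:
// fetch('./data/TRACKING_LOG/GENERAL_REPORT/' + file, {
//   headers: { Range: nextRange(0, CHUNK) }
// }).then(function (res) { return res.text(); })
//   .then(function (piece) { /* process piece, then request the next */ });
```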
If you can control the report generator, you can configure it to generate multiple smaller reports instead of a huge one.
Split the file into a lot of smaller files, or give the user FTP access. I doubt you'd want too many people downloading a gigabyte each off your web server.
I need to export multiple charts to the server with an Ajax call and store them as a PDF.
I render multiple charts on a single page using different containers. I need to convert them to images, send them to the server, and export them into a single PDF saved on the server. I appreciate any help.
See this post: http://highslide.com/forum/viewtopic.php?f=10&t=10463
And: http://highslide.com/forum/viewtopic.php?f=10&t=9239
Setting up server-side exporting is a bit of a mess; there are lots of ducks to get in a row before it will work.
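Once the server side is in place, the client half is straightforward: grab each chart's SVG and POST the batch. A sketch, where the /charts/pdf endpoint is an assumption for whatever service combines the SVGs into one PDF:

```javascript
// Collect the SVG markup of every rendered chart on the page.
// Each chart object is expected to expose Highcharts' getSVG() method.
function collectChartSvgs(charts) {
  return charts.map(function (chart) {
    return chart.getSVG();
  });
}

// Usage with jQuery (Highcharts.charts can contain null slots for
// destroyed charts, hence the filter):
// var svgs = collectChartSvgs(Highcharts.charts.filter(Boolean));
// $.post('/charts/pdf', { svgs: JSON.stringify(svgs) }, function () {
//   console.log('PDF stored on the server');
// });
```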