I have a website for professional purposes which displays some text and graphs (using Google Charts).
Take one page as an example, like 'http://mywebsite/board.php'.
Let's say the Google chart is inside one div on that page:
<div id='chart'>
My point is that I want to run automatically, using crontab (every morning at 5 am CET), a script which will screenshot the div with id='chart' and produce a .jpg of the chart.
My idea is then to use that .jpg for automatic email reporting via the $email object in PHP.
Could someone please help me out and give some ideas on how to perform that task?
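A minimal sketch of the screenshot step, assuming Node.js with the Puppeteer library is available on the server; the page URL and the 'chart' id come from the question, while the script name and output path are just placeholders:

    // capture-chart.js - screenshot only the element with id="chart"
    // assumes: npm install puppeteer
    const puppeteer = require('puppeteer');

    (async () => {
      const browser = await puppeteer.launch();
      const page = await browser.newPage();

      // load the page and wait until network activity settles so the chart has rendered
      await page.goto('http://mywebsite/board.php', { waitUntil: 'networkidle0' });

      // grab just the chart div and save it as a JPEG
      const chart = await page.$('#chart');
      await chart.screenshot({ path: '/var/www/reports/chart.jpg' });

      await browser.close();
    })();

The crontab entry could then be something like 0 5 * * * node /path/to/capture-chart.js (assuming the server clock is on CET; otherwise adjust the hour), and the PHP mailer just attaches the resulting chart.jpg.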
I'm making a blog website with articles which hyperlink together, and I want to create a node graph which depicts these connections.
I've tried using the AnyChart library, which takes the input nodes as JSON data, with a script on each HTML page that adds that page's nodes to the JSON file. However, this requires each HTML page to be opened individually to update the graph, whereas I want the graph to be generated each time the main blog page is loaded/refreshed.
Any help or suggestions would be greatly appreciated :)
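A rough sketch of one way to do this entirely on the main blog page, assuming the article pages are on the same origin (so they can be fetched in the browser) and that AnyChart's graph() chart type is used; the article URL list and container id below are placeholders:

    // on the main blog page: rebuild the node graph on every load
    // by fetching each article and extracting its outgoing links
    const articleUrls = ['/articles/post-1.html', '/articles/post-2.html']; // placeholder list

    async function buildGraphData() {
      const nodes = articleUrls.map(url => ({ id: url }));
      const edges = [];

      for (const url of articleUrls) {
        const html = await (await fetch(url)).text();
        const doc = new DOMParser().parseFromString(html, 'text/html');

        // every link that points at another article becomes an edge
        doc.querySelectorAll('a[href]').forEach(a => {
          const target = a.getAttribute('href');
          if (articleUrls.includes(target) && target !== url) {
            edges.push({ from: url, to: target });
          }
        });
      }
      return { nodes, edges };
    }

    buildGraphData().then(data => {
      const chart = anychart.graph(data);   // assumes AnyChart's graph module is loaded
      chart.container('graph-container');   // placeholder container id
      chart.draw();
    });

This way the JSON is never written to a file at all; the graph data is rebuilt from the current set of articles each time the page loads.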
I want to fetch this site
https://www.film-fish.com/modern-mindless-action
in order to get the IMDb IDs of all the movies listed there.
The problem is that the page only loads the full list of movies as you scroll down, so a simple wget doesn't work.
Even if I scroll to the bottom of the page and view the source code, I do not see the last movie in the list (Hard Kill (2020)).
So the problem seems to be that the content is being created via JavaScript.
Has anybody a tip on how to achieve that?
Indeed, executing JavaScript code is beyond the scope of GNU Wget; you would need a browser-automation tool. If you know some Node.js or JavaScript, I suggest taking a look at the PhantomJS Quick Start and Page Automation guides. In particular, look at the first example in the second link; you should be able to rework it to your needs, i.e. instruct the page to scroll down using JavaScript and then extract what you need, also using JavaScript.
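A rough PhantomJS sketch along those lines, assuming the movie entries end up in the DOM as links to imdb.com; the selector and the number of scroll passes are guesses that would need adjusting to the actual markup:

    // scrape-imdb-ids.js - run with: phantomjs scrape-imdb-ids.js
    var page = require('webpage').create();

    page.open('https://www.film-fish.com/modern-mindless-action', function (status) {
      if (status !== 'success') { phantom.exit(1); }

      var scrolls = 0;
      var timer = setInterval(function () {
        // scroll to the bottom so the page loads the next batch of movies
        page.evaluate(function () {
          window.scrollTo(0, document.body.scrollHeight);
        });

        if (++scrolls >= 20) {          // guessed upper bound on scroll passes
          clearInterval(timer);

          // collect anything that looks like an IMDb title link
          var ids = page.evaluate(function () {
            var links = document.querySelectorAll('a[href*="imdb.com/title/"]');
            return Array.prototype.map.call(links, function (a) {
              var m = a.href.match(/tt\d+/);
              return m ? m[0] : null;
            }).filter(Boolean);
          });
          console.log(ids.join('\n'));
          phantom.exit();
        }
      }, 1000); // wait a second between scrolls so content can load
    });

(PhantomJS development has since been suspended; headless Chrome driven by Puppeteer can follow the same scroll-then-extract pattern.)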
I have setup a dynamic competition page where the query string determines what content you see.
For example, http://nectarfinance.com.au/dc=korinadrogan will show Korina's content, while no query string will show the generic head-office content.
The site as it is loads slowly, and I know this is because of the dynamic Facebook 'like and share' scripts on the page.
I was wondering if there is any way to combine these scripts into one, or to make them load faster, or to reduce their size?
I'm not sure how to work around it as the files are externally hosted by Facebook.
I'll post the GTMetrix report in the answer below, as I can't post two links.
Thanks for your help
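You can't merge or shrink Facebook's own files, but you can stop them from blocking the rest of the page. A sketch of the usual approach: load the SDK asynchronously (and only once) so the like/share buttons render after your own content; the app id and SDK version below are placeholders:

    // load the Facebook JS SDK asynchronously after the page has rendered,
    // instead of letting it block the initial load
    window.fbAsyncInit = function () {
      FB.init({
        appId: 'YOUR_APP_ID',   // placeholder
        xfbml: true,            // parse the fb-like / fb-share markup on the page
        version: 'v2.5'         // use whatever SDK version the page already targets
      });
    };

    (function (d, s, id) {
      if (d.getElementById(id)) { return; }   // don't inject the SDK twice
      var js = d.createElement(s);
      var first = d.getElementsByTagName(s)[0];
      js.id = id;
      js.async = true;
      js.src = '//connect.facebook.net/en_US/sdk.js';
      first.parentNode.insertBefore(js, first);
    }(document, 'script', 'facebook-jssdk'));

The buttons will appear a moment later than the rest of the page, but the page itself no longer waits on Facebook's servers.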
I am in the process of making a website in Dreamweaver CC. On one of the pages, I want a section where the page shows a streamer's name (from Livestream) and, right next to it, whether they are currently live or not.
Livestream's API provides a URL that returns the status as an .xml file, e.g. xkainlessx.api.channel.livestream.com/2.0/livestatus.xml. The streamer's name appears in the URL in this format: x[channel name]x.api....
If the isLive status is false, I want an image in the HTML to stay at its default (probably just a red dot). If the isLive status is true, I want that image to change to a green dot or something similar.
I don't know how to use script tags in HTML, and I don't know JavaScript anyway, so I have spent a long time trying to solve this problem.
I have searched multiple sites for a solution from stackoverflow all the way to posting in the livestream.com forums.
If anyone can simplify this for me, that would be great!
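A small sketch of the idea, assuming the browser is allowed to fetch that XML directly (if Livestream doesn't send CORS headers you would need to fetch it through a small server-side proxy instead) and that the XML contains an isLive element as described; the image id and dot file names are placeholders:

    // put this inside a <script> tag near the end of the page;
    // <img id="status-dot" src="red-dot.png"> is the default (offline) image
    function updateLiveStatus() {
      fetch('http://xkainlessx.api.channel.livestream.com/2.0/livestatus.xml')
        .then(function (response) { return response.text(); })
        .then(function (xmlText) {
          var xml = new DOMParser().parseFromString(xmlText, 'text/xml');
          // assumption: the document contains an isLive element (possibly namespaced) with "true"/"false"
          var node = xml.getElementsByTagName('ls:isLive')[0] || xml.getElementsByTagName('isLive')[0];
          var isLive = node && node.textContent.trim() === 'true';
          document.getElementById('status-dot').src = isLive ? 'green-dot.png' : 'red-dot.png';
        })
        .catch(function () { /* leave the red dot in place if the request fails */ });
    }

    updateLiveStatus();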
Is there a way to implement functionality so that a user can right-click a subsection of an HTML page (say a DIV or other container element) and have that part saved as an image/PDF (using JavaScript)?
Alternatively (ideally) can this be done on the server side in ASP.NET?
The use case for this is the following:
I have some complex web pages generated in ASP.NET, using the JavaScript Flot library for the graphs. I would like to reuse part of the HTML page to generate PDF reports, or at least image snapshots which can easily be inserted into reports. I have looked around and it seems there is a tool, wkhtmltopdf, which converts the entire page to PDF; however, there are two issues:
This tool needs to be run separately, which is not friendly for end users
The tool extracts everything on the page, e.g. menus, headers, footers, etc.
For the second problem I could generate web pages without the headers/footers and menus, and then use the tool, but this does not solve problem 1. Ideally I would like to generate the report weekly and automatically so the user only needs to download it.
For this purpose, what is really needed is some way to save a DIV (or other element) referenced by id as a PDF or image. That way I would not need to write separate code to generate the reports. I realize there will be a loss of quality converting HTML to PDF, but for our purposes that is not very important.
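For the client-side half of the question (letting the user save a DIV as an image from the browser), one option worth knowing about is the html2canvas library, which redraws a chosen element into a canvas that can then be downloaded. A minimal sketch, assuming html2canvas is included on the page and the report section has a hypothetical id of 'report' (whether canvas-based Flot charts are captured faithfully would need to be verified):

    // render the #report div into a canvas and trigger a PNG download
    function saveDivAsImage() {
      html2canvas(document.getElementById('report')).then(function (canvas) {
        var link = document.createElement('a');
        link.download = 'report.png';
        link.href = canvas.toDataURL('image/png');
        link.click();
      });
    }

    // e.g. wire it to a button or a context-menu handler on the div
    document.getElementById('save-report-btn').addEventListener('click', saveDivAsImage);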
IECapt# is a new and experimental version of IECapt written in C# to render a web page into a BMP, JPEG or PNG image file.
see http://iecapt.sourceforge.net/
You will have to do some calculations if you want to crop the captured image to your requirements, or give the tool only the HTML you actually want as an image, instead of the whole page.
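For the "some calculations" part, the crop rectangle can simply be read off the element itself; a tiny sketch you could run in the browser console (the 'chart' id is just an example):

    // log the pixel rectangle of the element so it can be used as a crop box
    // for the full-page capture (coordinates relative to the top-left of the page)
    var el = document.getElementById('chart');          // example element id
    var rect = el.getBoundingClientRect();
    console.log({
      left: Math.round(rect.left + window.pageXOffset),
      top: Math.round(rect.top + window.pageYOffset),
      width: Math.round(rect.width),
      height: Math.round(rect.height)
    });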
Hope this helps.
In case this helps others, I finally settled on the iTextSharp library, which is very powerful and also handles SVG. It does not do a general HTML5-to-PDF dump, but with a bit of code I can do most of what I need.
Main website: http://itextpdf.com/
Download: http://sourceforge.net/projects/itextsharp/