Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
Closed 1 year ago.
I'm making an RSS reader in Flutter, but my requests to the feed are blocked by Cloudflare.
I've been looking for a way to emulate a browser with JavaScript enabled, since that is needed to pass the Cloudflare check, but nothing seems to offer that functionality.
What I need is a simulated browser that renders the requested page and executes the JavaScript it contains. The only thing I've found that claims to do this is webview_flutter, which is a widget and therefore cannot be used in my case.
I find it strange that there is no such thing as a headless browser for Flutter, so I must have missed something.
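Flutter itself has no off-screen browser, but the same effect is often achieved by proxying the feed through a server-side headless browser and having the app fetch the rendered result. A minimal sketch using Puppeteer (Node.js) — the function name is illustrative, and whether this passes any particular Cloudflare check is not guaranteed:

```javascript
// Fetch a page with its JavaScript executed, using headless Chromium.
// A Flutter app would request the rendered HTML from a small server
// exposing this function, instead of hitting the feed directly.
async function fetchRenderedHtml(url) {
  const puppeteer = require('puppeteer'); // `npm install puppeteer`
  const browser = await puppeteer.launch();
  try {
    const page = await browser.newPage();
    // Wait until network activity settles so challenge scripts can run.
    await page.goto(url, { waitUntil: 'networkidle2' });
    return await page.content(); // the fully rendered HTML
  } finally {
    await browser.close();
  }
}
```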
I'm hosting a side project with Firebase, and for some reason I'm getting a warning from Chrome when I try to visit it (but not from Firefox). I scanned the website for malware with multiple services (Sucuri, scanner.pcrisk) and couldn't find anything. The website is just a webpage, no server. Can anybody help me understand what is going on?
Website:
https://netflix-app-4bcc7.web.app/
Notably, on Firefox it's not blocked, but the padlock symbol has a warning indicator which, when I click on it, tells me that some of the content is not secure and there is mixed content. I have trouble understanding what this means, though.
It could be that your website is mimicking the actual Netflix site. Chrome may have noticed this and determined that your website could be a phishing site.
If this is a side project, changing the website name, and especially the URL, to something more unique may resolve the issue.
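As for the Firefox warning: "mixed content" means an HTTPS page loads some sub-resources (images, scripts, stylesheets) over plain http://, which browsers flag as insecure. A quick way to spot offenders is to check each resource URL for an insecure scheme — a minimal sketch with illustrative URLs:

```javascript
// Flags resource URLs that would trigger a mixed-content warning
// when the page itself is served over HTTPS.
function findMixedContent(resourceUrls) {
  return resourceUrls.filter((url) => url.startsWith('http://'));
}

// Example: one insecure image among otherwise secure resources.
const insecure = findMixedContent([
  'https://example.com/app.js',
  'http://example.com/poster.jpg',
]);
```

The usual fix is rewriting those references to https:// (or protocol-relative URLs) so everything is served over TLS.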
I have a heavily JavaScript-based website. Specifically, there are no <a> tags: the content is dynamically inserted into the DOM tree when certain buttons are clicked, and the URL is then updated using JavaScript to reflect the change.
So my question is: if I have a list of links in my robots.txt, will the allowed web crawlers (Google, Bing, etc.) directly access the links in robots.txt, or will they only follow the <a> links present in the downloaded page that robots.txt allows?
Because in the second case, the web crawler will not find any URL that appears in both the downloaded site and the robots.txt file.
You could use a Sitemap to give crawlers a list of URLs. As @Barmar mentioned, the purpose of robots.txt is different: it tells crawlers which URLs they may not fetch, not which URLs exist.
I don't know anything about JavaScript web programming... but I want to learn.
My question is: can I use Visual Studio 2015 to write, debug, and publish(?) JavaScript web applications?
Again, I don't know if 'publishing' is the right terminology for deploying web applications written in JavaScript... or if you can even 'write' web applications in JavaScript...
Any information on the topic would be appreciated.
Thanks
Yes, you can use Visual Studio to create JavaScript files. It also has publishing tools; depending on the app and environment there are many ways to go about it, and VS supports most of them (repositories, FTP, etc.).
I recommend just diving in: fire up a blank project and find some tutorials to get started. JavaScript is probably one of the easiest languages to begin with, because it needs nothing more than somewhere to write text.
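To see just how little is needed, a first program can be a single file — save something like this as hello.js and run it with `node hello.js`, or paste it into the browser's developer console (the file name is of course your choice):

```javascript
// A first JavaScript program: build a string and print it.
const greeting = ['Hello', 'JavaScript'].join(', ');
console.log(greeting + '!');
```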
While browsing I came across this link.
As you can see, the image is protected: if you browse cars-database.com and request the source image directly, you get the same message.
This is the first time I have seen this, and I wonder how it has been implemented. Does anybody know?
Screenshot of the protected source image:
Here's a simple example of how to implement something similar in nginx:
location ~ \.(jpe?g|png|gif)$ {
    valid_referers none blocked mysite.com *.mysite.com;
    if ($invalid_referer) {
        rewrite ^ http://mysite.com/lowres$request_uri permanent;
    }
}
They likely just keep two versions of the image on the server: the "real" one and one with the extra message added via a Photoshop template or similar. They then check the Referer header, along the lines of:
if the request is for an image and the Referer does not match cars-database.com/*
    then serve "watermarks/<requested image>"
This can be implemented trivially with an Apache mod_rewrite rule, any other web server's rewrite system, or any server-side programming language such as PHP or Python.
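The check described above can be sketched as a plain function (the path layout, domain pattern, and function name are assumptions for illustration, not the actual cars-database.com implementation):

```javascript
// Decide which file to serve based on the Referer header:
// same-site requests get the real image, everything else gets
// the pre-rendered watermarked copy under /watermarks.
function imagePathFor(requestPath, referer) {
  const allowed = /^https?:\/\/([^/]+\.)?cars-database\.com\//;
  if (referer && allowed.test(referer)) {
    return requestPath; // same-site request: real image
  }
  return '/watermarks' + requestPath; // hotlink: watermarked copy
}
```

Note that the Referer header is client-supplied and trivially spoofed, so this deters casual hotlinking rather than providing real protection.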
I've downloaded Canviz (a JavaScript library for drawing graphs) and unzipped the archive. When I open index.html, which is supposed to be an example, in Chrome, a "loading" message appears... and never disappears.
What am I supposed to do, please?
I worked with this library a few months ago.
You also need to have Graphviz installed, since it generates the graph layouts that Canviz draws.
Once it is installed, you have to call the load function with the URL of the graph file in order to create the graph.
Web browsers also have an option to enable or disable JavaScript; make sure it is enabled in yours.
Here's an example for Mozilla: Mozilla example