I have recently been trying to mess around with the JS code of different websites to explore the programming language further. I have been able to successfully modify the scripts of these websites, but I cannot find a way to save the changes, so they do not take effect. I am using the newer Microsoft Edge (Version 92.0.878.0 (Official build) dev (64-bit)).
It shows "changes to this file were not saved to the system". How do I go about this?
(Please note: the website is external. I have tried downloading the HTML and modifying it, but then the site does not function.)
This is the default behavior if you don't add the files to a Workspace. If you want to save the changes to the file system, you need to use Filesystem -> Workspace.
You can refer to Edit files with Workspaces for the detailed steps of using a Workspace. After Step 1: Set up, you can go directly to Step 4: Save a JavaScript change to disk.
Besides, there's another workaround for testing changed JS code without saving it to the file system. You can refer to this answer for how to achieve that.
The Windows file system is preventing Chrome from saving files whose names contain special characters. Chrome uses filenames to match the files in your local overrides with the ones on the webpage.
Because Windows replaces special characters ("?") with escaped ones ("%3F"), Chrome is unable to match them and shows that error.
I don't know the perfect solution for that, but one of the workarounds would be to remove the cache-busting string that starts with the question mark. Very nasty bug in local overrides.
Final Update
For most practical purposes, this question is obsolete, as both Firefox and Chrome now have native support for AVIF through the standard picture HTML tag with a source marked as type="image/avif". See https://reachlightspeed.com/blog/using-the-new-high-performance-avif-image-format-on-the-web-today/ . Firefox still likes to hang and often forces a Ctrl+F5 to bypass caches, and it requires sending the correct content type from the server. Hopefully this will be fixed soon.
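For reference, the markup looks roughly like this (file names are placeholders); the browser picks the AVIF source when it supports it and silently falls back to the JPEG otherwise:
<picture>
  <source srcset="page-01.avif" type="image/avif">
  <img src="page-01.jpg" alt="Comic page 1">
</picture>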
Here is the commit where I got avif support working: https://github.com/quackack/quackack-comics/commit/f1a98ed1f40b6a22584d61bc338bd91df3232fa5#diff-e25b0950ce48f4e928f98e0a6fbb694c . Note that it contains many unrelated changes and in fact avif is barely mentioned, only as a content type and a file extension.
Original Question
I am trying to change a website where I host web comics from using JPEGs to the newer AVIF image format. It is much smaller and seems to be the new image format with the most widespread support. Unfortunately, web browsers don't properly support the new format yet. So I was planning to use this package: https://github.com/Kagami/avif.js to allow my comics to be rendered. Some basic tests showed that AVIF would give the same quality as JPEG for less than half the space.
Unfortunately, after more than 5 hours spent on this, I am unable to get it to work with my React setup. You can see my website at the time of writing at 'https://github.com/quackack/quackack-comics/tree/cda4c3893d8477192c4ff3aa78d00096b7621ff7'.
I tried using npm install to install avif.js and then added
require("avif.js").register("/avif-sw.js");
to index.js. But I get the error:
Failed to register/update a ServiceWorker for scope ‘http://localhost:3000/’: Bad Content-Type of ‘text/html’ received for script ‘http://localhost:3000/avif-sw.js’. Must be a JavaScript MIME type
And AVIF files are still not able to load. I think the requests are getting rerouted to index.html instead of to the JavaScript package. It seems like the appropriate thing to do is something like:
import * as avif from 'avif.js';
avif.register('avif.js/avif-sw.js');
But this fails too with the same error, as do many other similar variations.
At this point I am inclined to wait for proper browser support for avif, as I don't get enough traffic to worry about data costs anyway. If this could easily be fixed, then I would love to have the improvements from avif. I just want smaller file sizes and widespread browser support.
Update
Okay, I found that I could get this to work if I changed from the default React bundler (which I believe is webpack) to Parcel. Then it does work exactly as you'd expect... until I try to deploy the project.
There is an issue where I cannot load the service worker when I try to deploy my single-page web app to AWS. There it makes a request to my URL for avif-sw.js, where there is not actually a JS file. I believe the issue is closely related to https://github.com/parcel-bundler/parcel/issues/670
So the first key is to use Parcel to build your web app. But it seems Parcel still does something wrong with deployment. I will continue to investigate this in a few days.
Here is the almost working version using parcel: https://github.com/quackack/quackack-comics/tree/parcel
Update 2
My earlier update was incorrect. I only thought it was working because of a cached service worker. My final solution is in the answer below.
It doesn't seem to be a problem now. Simply convert your images to AVIF with tools like https://avif.io/ and use them as the image source, or as a background via typical CSS. As Chrome and Firefox now support it (even though users still have to enable it in Firefox), everything goes well. It even works on mobile now! :)
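For example (file names made up), either plain markup or a CSS background works:
<img src="page-01.avif" alt="Comic page 1">
.cover { background-image: url("page-01.avif"); }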
Okay, I finally got it working. Unfortunately, mobile devices don't seem to be able to handle the large file sizes, so I had to keep using JPEG anyway. It worked on my laptop, though.
Here is the commit that got everything to work: https://github.com/quackack/quackack-comics/commit/75e75307e688f0e515b4bbc9eb22eef290d2c209
What I had to do:
Switch to Parcel.
Copy the contents of the avif.js library into source before building. I used the command:
copyfiles -f node_modules/avif.js/*.js .
Put this specifically into reg.js:
require("avif.js").register("./avif-sw.js");
navigator.serviceWorker.register("./avif-sw.js", undefined);
What the last two steps do is trick Parcel into actually keeping a copy of "avif-sw.js" around that can be loaded as a service worker. Probably with a bit more tinkering you could get this to work without using Parcel at all, just by copying the files locally and then registering. No requires required. But I stopped investigating after I found this solution doesn't work on mobile.
This was exceptionally hard to debug because service workers are cached by the browser, so I had to clear browser data after every edit. It was also hard to debug because the source files are cached too, so I had to delete my project's cache and rebuild frequently as well.
You might also want to use the npm module "http-server-spa", or similar, to test how your built SPA will act when deployed.
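From memory, the invocation looks roughly like this (I may be misremembering the exact argument order, so check the package's README; the folder and port here are just examples):
npm install -g http-server-spa
http-server-spa ./dist index.html 8080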
I'm trying to write a plugin for TFS 2015 (the version is important). I read a couple of manuals; the examples all turn out simple, but it is more difficult with a real plugin. My problem: where can I keep settings for my TFS extension?
I want to move request addresses out of the code into some kind of settings (they will be the same for all users), so that during operation I could quickly change them without changing the plug-in code. But I can't find any solution to this problem.
I found this info: https://www.visualstudio.com/en-us/docs/integrate/extensions/develop/data-storage but it still keeps the settings in code.
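As far as I can tell, the API from that page boils down to something like this (just a sketch; the key name and URL are mine), and the initial value still has to come from somewhere in the code:
VSS.getService(VSS.ServiceIds.ExtensionData).then(function (dataService) {
    // store a collection-wide setting (the same value for all users)
    dataService.setValue("requestUrl", "https://example.com/api", { scopeType: "Default" });
    // read it back later
    dataService.getValue("requestUrl", { scopeType: "Default" }).then(function (url) {
        console.log(url);
    });
});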
I will be grateful for any information.
Not sure if I totally got your point. You just need to change the settings in your own extension code and publish it to TFS. Then all users on the client side using your extension will automatically get the updated settings; they don't need to manually change the code or update anything.
Update
Unfortunately, this is not supported for now. You have to go through the code to change the settings, then rebuild and package the extension, and finally update/install and test your extension.
I'm developing a Symfony2 application using PHPStorm IDE.
I can't seem to make it work. I've tried JavaScript Debug configurations for both local and remote with multiple parameters. The messages vary: "Remote URL isn't specified for so breakpoint..." etc.
The best-case scenario would be to debug my JavaScript code inside PHPStorm. Is this possible?
I'm also using AsseticBundle for my assets.
I use PHPStorm too, but I only use it for coding; debugging is done in the browser.
So I would say it depends on whether PHPStorm executes app_dev.php and grabs the output.
If you already have experience with PHPStorm debugging, you could answer this yourself. I myself don't think so...
Hope I could help at least a little :)
JavaScript debugging is definitely possible from PhpStorm if you are using the Chrome or Firefox browser. Debugging JavaScript and PHP at the same time is not supported at the moment.
I was able to set this up:
1. Install the JetBrains Chrome extension.
2. In PHPStorm, click the Run/Debug Configurations drop-down just to the left of the green bug near the top right.
3. Add a JavaScript Debug configuration (using the plus sign).
4. In the URL box, enter the URL of the page you're debugging (e.g. http://my.test/app_dev.php/route/here/not-there/3290), name the config, and apply.
5. Add breakpoints in the JS.
6. Make sure dev tools is closed in the browser, then click the green bug (ensuring that the config named in step 4 is chosen) and it will redirect to the page specified in step 4.
7. The breakpoints will now be activated and can be stepped through (like Xdebug). When it stops on a breakpoint it may be in a version of the original file with the file's name appended with a version number (this file is read-only). There is a button to the left-hand side of the name where you can reload the read-only file being stopped at, after you have updated the actual (writable) file.
How is it possible to programmatically save a web page snapshot with all its elements (CSS, JS, images, ...) into one file?
I need to archive some web pages regularly. However, just saving their HTML code is useless - not only because of the missing images but especially because the absence of CSS on today's pages can turn a web page into an unrecognizable mess.
I remember the .mht format that worked like this, but that required manual saving, and it was just a feature of IE. I believe there is an open-source solution that can achieve this programmatically, but despite hours of searching I cannot find it on the web.
HTTrack, with its -%M option (which generates a MIME-encapsulated full archive, i.e. a single .mht file).
Use wget in terminal
wget -p -k http://www.example.com/
It'll make a clone of the site's front end (HTML, CSS, JS, SVG, etc.), though not in one file as asked. Rather, it'll recreate the whole folder structure.
E.g. if the folder structure of www.example.com is
/css/*
/js/*
/index.html
then it'll create the same structure locally.
Docs: https://www.gnu.org/software/wget/manual/wget.html
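Since the question asks for something programmatic that ends up as one file, a rough workaround (just a sketch; the URL and folder names are placeholders, and note that -H can pull in a lot of third-party assets) is to run wget from a small script and then pack the mirror into a single archive:
#!/bin/sh
# -p: download page requisites (CSS, JS, images), -k: rewrite links to the local copies,
# -E: add .html extensions where needed, -H: also fetch assets hosted on other domains,
# -P: put everything under the snapshot/ directory
wget -p -k -E -H -P snapshot "http://www.example.com/"
zip -r "snapshot-$(date +%Y%m%d).zip" snapshot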
I think #reisio (+1) has you covered...
...But if only to plug a great free tool, I would point out the Firefox extension Save Complete, which does an admirable job of grabbing "complete" pages on an ad hoc basis. The output will be a single HTML file with an accompanying directory stuffed with all the resources - you can easily zip them up for archiving.
It's not without fault - I've had issues with corrupted .png files lately on OSX, but I use it frequently for building mockups off of live pages and it's a huge time-saver. (Also of note, it hasn't been updated for FF 4 yet, and is the sole reason I rolled back to 3.6)
If you are using Google Chrome, just use the Save page as menu entry (Ctrl+S), and select complete website from the options at the bottom of the file dialog. This saves the HTML and all required resources (in a separate folder).
Apple's Safari has a pretty good solution. It saves all HTML and CSS (sadly no JS) but in a format called webarchive. It's one file, but it requires Safari to save and open, and Safari requires a Mac. Even though Safari for Windows does exist, it's too old to work with webpages, and it doesn't even support saving as webarchive, or opening them. If you have a Mac, open any website in Safari and press ⌘S and then make sure that Web Archive appears in the drop down.
There is also a Chrome extension that can open these types of files, but not save them.
Apologies for replying to such an old thread, just wanted to spread this info!
I have a web page which includes an insane amount of minified JS files. The web page works perfectly fine on my local network but throws some JS errors on staging. There is an issue in the JS and I want to debug it. When I load the JS in Firebug's Script tab it appears in one long horizontal line. Is there a way in Firebug to expand or beautify the script for debugging? I know I can use jsbeautifier, but that won't help me: I cannot upload an expanded file to the CDN, as that defeats the purpose of using a CDN.
Points to be noted:
a) I cannot control the box which serves the JS; it's on a CDN, as I mentioned.
b) I can use beautifiers etc., but would that help me in debugging the script at run time? IMHO, no.
c) Being bound by an NDA and other legal constraints I cannot share the script, but it's a generic problem; you can encounter it with a minified jQuery.
Beautify your script
Add the CDN host to /etc/hosts or your local DNS so that it resolves to your own web server (see the example entry after this list)
Host the beautified version and everything that you need on said web server
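The hosts entry itself is a single line; the CDN hostname below is just an example, and the IP should be whatever machine serves your beautified copy:
127.0.0.1    cdn.example.com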
Both Firefox and Chrome (versions as of this edit) support beautifying script with the {} button available in the inspector.
Just load the minified file and press the {} button at the bottom and it instantly beautifies, making breakpoints and other debugging possible. (True for Chrome too)
This is a common problem and the Chrome dev team have recently come up with an elegant solution, which they've called Source Maps - see http://www.html5rocks.com/en/tutorials/developertools/sourcemaps/ for more info, I think you'll find it's exactly what you (and the rest of us) have been crying out for! :)
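Roughly, the minifier emits a .map file next to the minified script and appends a comment pointing to it (the file name below is a placeholder), and the dev tools then show you the original source while you step through the minified code:
//# sourceMappingURL=app.min.js.map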
This is more of a workaround, but it can help. The idea is that we will replace files coming from the server with files on your machine.
This will work with any browser.
It takes a bit of setup the first time (15 minutes maybe), but then it can be very convenient.
It can also help with testing your bug fixes in a live/prod environment.
Get Fiddler (it's a web debugging proxy), install it, run it.
http://www.fiddler2.com/fiddler2/
(Restart browser after install to get the Fiddler extension)
If you debug an HTTPS website, check this first:
http://www.fiddler2.com/Fiddler/help/httpsdecryption.asp
From now on, you should see in Fiddler ("Web Sessions" pane on the left) all downloads made by your browser, including JS files.
If not, check this : Fiddler not displaying sessions
Find the file you want to debug in the list (Ctrl+F works)
Click on the file. Then either:
get the file content from the Inspectors pane (TextView tab), beautify it, and save it to a file on your local computer,
or use a file which already contains the source code (e.g. from your source control).
Go to AutoResponder tab (top left pane).
Select "Enable automatic responses" checkbox.
Select "Unmatched requests passthrough" checkbox.
Drag your file from the left pane to the right pane (this prefills the rule editor at the bottom; the finished rule is shown after these steps)
Set the other field to the path of your local file
Click the Save button
Reload the page and enjoy your debugging session.
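The finished rule ends up as a pair along these lines (the URL and local path are invented for illustration; Fiddler also accepts prefixes such as EXACT: for exact matching):
Match:        EXACT:http://cdn.example.com/js/app.min.js
Respond with: C:\debug\app.beautified.js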
Fiddler can do many more things, but this use-case answers the initial question.
Consider a Change!
Firefox with Firebug was my favorite JavaScript debugging method for almost a year, but I've recently moved to Google Chrome's Developer Tools, which are far superior.
Chrome supports on-the-fly (built-in) beautification of JavaScript resources.
Once beautified, you are free to debug the JavaScript resource file as if it had been downloaded from the web server already beautified. Breakpoints are set by clicking the line number.
One of the most powerful features is that once you've stopped at a breakpoint, you are free to execute commands (using the console) in the same scope you are in at the breakpoint. In Firefox you can't do that.
It's so easy to debug (even anonymous functions) that you'll never go back to Firefox.
Try It!
Pretty-print your JavaScript. Google this and you'll find multiple on-line JS beautifiers.
I happen to use http://jsbeautifier.org/ myself and it works fine, but search for others and use one that suits your needs.
Caveat: You still won't be able to get meaningful local variable names (which are usually renamed by a minifier). If the code was compiled by the Closure Compiler, then you absolutely won't get any useful information back at all, even when beautified, because then all variables and functions and properties are mangled (not only local ones).
Now, if your problem is with debugging code that comes from outside (e.g. a CDN), obviously that code will be minified, and you can't save your beautified version back there. In this case, you can replace the tags that load code from the CDN with a URL pointing to your local version; then you can serve the beautified code (downloaded from the CDN) from your own server and debug with Firebug.
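For instance (URLs invented for illustration), you would swap a tag like the first one below for the second, pointing at your own server:
<script src="https://cdn.example.com/js/app.min.js"></script>
<script src="http://localhost/js/app.beautified.js"></script>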
Now, if you don't even control the HTML that contains those tags (e.g. they reside on an outside server), then unfortunately there is no way for you to do what you want without physically downloading the entire site to your own server. Even if you downloaded the entire site (with all the files), it may not work, since the site may be driven by a back-end processing language or access a back-end database. In that case you'll also need to simulate all that data. It can be done, however; you just have to go through a lot of pain. My recommendation is to save a version of the web page and run it on your own server, serving beautified code from your own server to debug.
Placing breakpoints on JavaScript makes debugging much easier, but if your code has already made it to production then it's probably been minified. How can you debug minified code? Helpfully, some of the browsers have an option to un-minify your JavaScript.
In Chrome and Safari, simply select the 'Scripts' tab, find the relevant file and then press the "{ }" (pretty print) icon located in the bottom panel.
In Internet Explorer, click the tool icon by the script selection drop down to find the option to format the JavaScript.
Opera will automatically prettify minified JavaScript. Source