I have different environments on which I'd like to test my manifest and my ServiceWorker; however, I can't find a way to have multiple manifest.json files, one for each environment.
Is there any way?
You can have your build process pick the appropriate manifest.json file. For example, if you are using webpack, you can write a script that picks pr/manifest.json for a PR build and qa/manifest.json for a QA build.
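As a minimal sketch of that approach, assuming copy-webpack-plugin is installed, the manifests live under manifests/pr/ and manifests/qa/, and an ENV variable is set by the build command (all of these names are illustrative assumptions):

// webpack.config.js
const CopyWebpackPlugin = require('copy-webpack-plugin');

// Pick the manifest directory from the environment, defaulting to qa.
const env = process.env.ENV || 'qa';

module.exports = {
  // ...your existing entry/output/loader config...
  plugins: [
    // Copy the environment-specific manifest to the output root as manifest.json.
    new CopyWebpackPlugin({
      patterns: [{ from: `manifests/${env}/manifest.json`, to: 'manifest.json' }],
    }),
  ],
};

You would then run something like ENV=pr npm run build for the PR environment.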
The manifest is linked to your application via a <link> tag on your home page, say index.html, so you could also manipulate that tag to pick a manifest based on the environment. But this would be hacky and not a good idea; I would go with the first option.
Hope this helps.
I would like to know how to make my Next.js project work on every computer without having to reinstall all my modules.
Basically, I have my folder, nextjs-node, containing the following folders:
components
lib
public
node_modules
pages
styles
package.json
I'd like to know if there is something I could do to create an icon I can just click to launch my website, without having to go to the folder in my terminal and type npm run dev.
As a heads up, Stack Overflow is more for asking specific questions than general project approaches, so this will most likely be flagged eventually.
However, to answer your question:
Next.js is a framework for building websites, which means that what you want to do isn't quite possible as-is, but there are a couple of options here:
You can create an executable that launches your app as an Electron app; an example could be built with nextron: https://github.com/saltyshiomix/nextron
You can always just host this as a website, and users can use shortcuts to just link over to the site in a web browser.
You can make your site a progressive web app where the user can save your site locally (however you will still need to do option 2).
Every time we release a new version of our software which is bundled using Browserify, we are finding that we need to ask our users to clear their cache using the regular methods of CTRL+F5 or diving into the browser settings. It is not ideal when there are a thousand or so users. We are trying to work out a way that we can perhaps get around this. I am open to all sorts of options.
Our project is ReactJS based, so runs in the browser and connects to back end services via a RESTful API. We do track which version is loaded and this is visible from within the console. Using the version number we can compare on two different machines that one user is running the latest version whereas someone else may not be.
The code is bundled into two separate files and I feel that this is where we should be looking.
You need to change the file name on each new release.
A hash of the file contents is an appropriate thing to add.
Check out md5ify to add this to your project build.
If you implement this yourself, make sure to also load the correct filename in your index.html file.
Edit:
To automatically load the correct file you need to have a placeholder in your main html.
Then you need a manifest.json file that looks like the following:
{
"main.js": "main.[HASH].js"
}
This has to be created automatically after the bundling.
Now you can replace the placeholder with the correct asset by doing a lookup in the manifest file.
You either have to write your own scripts for this or use something like gulp together with browserify.
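If you write your own script, the whole flow might look like this minimal Node sketch; the dist/ and src/ paths and the {{MAIN_JS}} placeholder are placeholder names, not part of the original setup:

// hash-assets.js - run after browserify has produced dist/main.js
const crypto = require('crypto');
const fs = require('fs');

// Hash the bundle contents so the filename changes on every release.
const source = fs.readFileSync('dist/main.js');
const hash = crypto.createHash('md5').update(source).digest('hex').slice(0, 10);
const hashedName = `main.${hash}.js`;
fs.renameSync('dist/main.js', `dist/${hashedName}`);

// Write the manifest mapping the stable name to the hashed one.
fs.writeFileSync('dist/manifest.json', JSON.stringify({ 'main.js': hashedName }, null, 2));

// Replace the placeholder in the HTML template with the hashed filename.
const html = fs.readFileSync('src/index.html', 'utf8');
fs.writeFileSync('dist/index.html', html.replace('{{MAIN_JS}}', hashedName));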
Another solution would be webpack, which can generate content-hashed filenames out of the box.
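A minimal sketch of that, assuming html-webpack-plugin injects the bundle into your template (the entry and template paths are made up):

// webpack.config.js
const path = require('path');
const HtmlWebpackPlugin = require('html-webpack-plugin');

module.exports = {
  entry: './src/index.js',
  output: {
    // [contenthash] changes whenever the bundle contents change,
    // so browsers can never serve a stale copy of a new release.
    filename: '[name].[contenthash].js',
    path: path.resolve(__dirname, 'dist'),
  },
  plugins: [
    // Writes dist/index.html with the hashed filename already injected.
    new HtmlWebpackPlugin({ template: './src/index.html' }),
  ],
};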
In case the question wasn't clear. I have 3 MVC projects in one Solution. Every time I create a new project it adds the "Scripts" folder with all the .js files I'll ever need. I don't want to have this created every time for every application. Is there a way to reference scripts from a central folder in the solution so all applications/projects can share one common script folder with all the scripts common among them?
Edit:
Please explain the pros and cons of doing this if there are any...now I'm curious.
Here is what I would recommend:
Right click the solution and create a New Solution Folder called Common Javascript Files (or whatever you feel like calling it).
Right click on the Solution, click Open Folder in Windows Explorer, or navigate there manually for other versions of Visual Studio.
In the solution directory, create a directory with the same name as the solution folder (solution folders do not normally match directories at the source code level, but this one will, for sanity's sake).
In this new directory, add files that need to be shared between solutions.
In Visual Studio, click the solution folder and select Add - Existing Item.
In the file selection dialog, navigate to the directory previously created, select the file(s) added to the directory and click Add.
In each Project that needs a shared file, right click on the project (or directory within the project) and click Add - Existing Item.
Navigate to the shared directory, select the files, click the drop-down arrow next to the Add button, then click Add As Link.
Now the files in the projects are essentially shortcuts to the files in the Solution Folder, but they are treated as actual files in the project (this includes .CS or Visual Basic files; they will be compiled as if they actually existed in the project).
PROS
Files are truly shared across projects at Design time
Only the files needed for each project can be added, it's not all or nothing
Does not require any configuration in IIS (virtual directory etc)
If the solution is in TFS Source control, you can add the Directory to the TFS Source and the shared files will be source controlled.
Editing a file by selecting it in the Project will edit the actual file.
Deleting a Linked file does not delete the file.
This is not limited to JS files, linked files can be ANY file you might need (Images, Css, Xml, CS, CSHTML, etc)
CONS
Each deployment gets its own copy of the file.
There is a small learning curve when understanding that Solution Folders are not Directories that exist in a Solution Directory.
The best thing to do, IMO, is to roll your own CDN. Basically, just create another site in IIS and give it its own binding, e.g. "http://cdn.somedomain.com".
Then store all of your css/js/fonts/shared images etc on the CDN site and link to them from your other sites.
Doing so solves two problems:
All of your stuff is shared where it needs to be, and you only have to manage one revision per file.
Your users' browsers can cache them in a single location instead of downloading copies of your stuff for every site that uses them.
I added this answer because I see a lot of people referencing creating virtual directories. While that does indeed share the files, it creates multiple download paths for them, which is an extreme waste of bandwidth. Why make your users download jquery.js once for every site when you can let them download it once from cdn.somedomain.com?
Also, when I say waste of bandwidth, I'm not just talking about server bandwidth; I'm talking about mobile users on data plans. As an example, I hit our company's HR site (insurance etc.) on my phone the other day and it consumed 25 MB right out of the gate, downloading jquery and a bunch of other stuff five times each. On a 2 GB a month data plan, websites that do that really annoy me.
Here it goes. IMO this is the best and easiest solution; I spent a week trying to find one, and every other approach had more cons than pros:
Resources(DLL)
    Shared
        images
            image.png
        css
            shared.css
        scripts
            jquery.js
MvcApp1
    Images
    Content
    Shared <- We want to get files from above dll here
    ...
MvcApp2
    Images
    Content
    Shared <- We want to get files from above dll here
    ...
Add the following to MvcApp1 -> Project -> MvcApp1 Properties -> Build events -> post-build event:
start xcopy "$(SolutionDir)Resources\Shared\*" "$(SolutionDir)MvcApp1\Shared" /r /s /i /y
Here is an explanation of what it does: Including Build action content files directory from referenced assembly at same level as bin directory
Do the same for MvcApp2. Now, after every build, fresh static files will be copied to your app, and you can access them like "~/Shared/css/site.css".
If you want, you can adjust the above command to copy the scripts from the DLL into the Scripts folder of every app; that way you could move some scripts into the DLL without having to change any paths. Here is an example that copies only the scripts from Resources/Shared/scripts into MvcApp1/Scripts after each build:
start xcopy "$(SolutionDir)Resources\Shared\Scripts\*" "$(SolutionDir)MvcApp1\Scripts" /r /s /i /y
This is a late answer, but Microsoft has added a project type called Shared Project, starting with Visual Studio 2013 Update 2, that can do exactly what you want without having to link files.
The shared project reference shows up under the References node in the Solution Explorer, but the code and assets in the shared project are treated as if they were files linked into the main project.
"In previous versions of Visual Studio, you could share source code between projects by Add -> Existing Item and then choosing to Link. But this was kind of clunky and each separate source file had to be selected individually. With the move to supporting multiple disparate platforms (iOS, Android, etc), they decided to make it easier to share source between projects by adding the concept of Shared Projects."
https://blogs.msdn.microsoft.com/somasegar/2014/04/02/visual-studio-2013-update-2-rc-windows-phone-8-1-tools-shared-projects-and-universal-windows-apps/
Info from this thread:
What is the difference between a Shared Project and a Class Library in Visual Studio 2015?
https://stackoverflow.com/a/30638495/3850405
A suggestion that will allow you to debug your scripts without re-compiling the project:
Pick one "master" project (which you will use for debugging) and add the physical files to it
Use "Add As Link" feature as described in Eric's answer to add the script files to the other projects in solution
Use CopyLinkedContentFiles task on Build, as suggested in Mac's comment to copy the files over to the second over to your additional projects
This way you can modify the scripts in the "master" project without restarting the debugger, which to me makes the world of difference.
In IIS create a virtual folder pointing to the same scripts folder for each of the 3 applications. Then you'll only need to keep them in a single application. There are other alternatives, but it really depends on how your applications are structured.
Edit
A scarier idea is to use Areas. In a common area have a scripts directory with the scripts set to be compiled. Then serve them up yourself by getting them out of the dll. This might be a good idea if you foresee the common Area having more functionality later.
Most of the files that are included by default are also available via various CDNs.
If you're not adding your own custom scripts, you may not even need a scripts directory.
Microsoft's CDN for scripts: http://www.asp.net/ajaxlibrary/cdn.ashx
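If you do go the CDN route, a common pattern is to fall back to a local copy when the CDN is unreachable; the jQuery version and local path below are made-up examples:

<script src="https://ajax.aspnetcdn.com/ajax/jQuery/jquery-1.9.1.min.js"></script>
<script>
  // If the CDN request failed, window.jQuery is undefined; load the local copy instead.
  window.jQuery || document.write('<script src="/Scripts/jquery-1.9.1.min.js"><\/script>');
</script>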
I'm trying to figure out a way to automatically generate an ApplicationCache manifest file from all the HTML,CSS,JavaScript and images files used by our website.
We need this because we need to support offline usage of the website. More precisely, offline usage of an ArcGIS API for JavaScript webapp.
We are using the ApplicationCache rather than service workers because supporting iOS is a critical requirement, and service workers are not supported at all on iOS, in any browser.
The idea is that I'll manually call a function, after the site is fully loaded, that dynamically creates the text for the new manifest, then manually copy/paste it into the manifest file. It's something I would only do when something in the site changes and the manifest file needs to be updated.
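To illustrate, here is a rough sketch of the kind of function I have in mind, using the Resource Timing API; whether performance.getEntriesByType('resource') reliably captures everything, including the AMD-loaded scripts, is exactly the part I'm unsure about:

function buildManifestText() {
  // Every resource the page has loaded so far, as absolute URLs.
  var urls = performance.getEntriesByType('resource').map(function (entry) {
    return entry.name;
  });
  return 'CACHE MANIFEST\n# generated ' + new Date().toISOString() +
    '\n\nCACHE:\n' + urls.join('\n') + '\n\nNETWORK:\n*';
}

// Call this from the console once the site is fully loaded, then
// copy/paste the output into the manifest file.
console.log(buildManifestText());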
This tool, ManifestR, is very close: http://westciv.com/tools/manifestR/
but it has two issues:
1- It does not handle image file URLs found in CSS files properly. For instance, if it finds url(../images/myimage.png), it will add the relative link ../images/myimage.png directly to the manifest file instead of the absolute link, like www.mysite.com/images/myimage.png.
2- It does not list any of the scripts loaded through dojo.require (AMD modules).
I'm thinking of using similar code to fix these issues and compile the list of files. I already see how to fix #1, but can't figure out how to fix #2.
So, using JavaScript, how can I find the list of all script URLs used by the website, not just those loaded through <script> tags (found in document.scripts) but those loaded using AMD modules as well?
Basically I want to compile the same list that Chrome is showing me for the website in the Sources pane.
I'm thinking that if this isn't available anywhere, maybe I could create a proxy function for dojo.require that keeps track of all files loaded through AMD.
But I wanted to ask here first: maybe I missed a tool or script that already does this? Or maybe my plan isn't good?
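For reference, a rough, untested sketch of that proxy idea (all names here are assumptions, and it would only see top-level require() calls, so nested dependencies would need the same treatment inside the loader):

var loadedModuleUrls = [];
var originalRequire = window.require;

// Wrap the global AMD require() so every requested module id is recorded.
window.require = function (deps, callback) {
  if (deps instanceof Array) {
    for (var i = 0; i < deps.length; i++) {
      // require.toUrl() maps a module id to the URL the loader would fetch.
      loadedModuleUrls.push(originalRequire.toUrl(deps[i]) + '.js');
    }
  }
  return originalRequire.apply(this, arguments);
};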
Thanks
I've never used ApplicationCache for an ArcGIS API for JavaScript app, but I would recommend that you first serve up a custom Dojo build of your application in order to bundle your code into one or more build layers. If you configure your Dojo build properly (no small feat) you should know the exact scripts that will be required.
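For what it's worth, a minimal build profile sketch; the package names and paths are placeholders I made up. The layer file it produces would then be the main script you list in the manifest:

// app.profile.js - consumed by the Dojo build system
var profile = {
  releaseDir: '../dist',
  packages: [
    { name: 'app', location: 'app' }
  ],
  layers: {
    // Bundle app/main and everything it depends on into one layer file.
    'app/main': {
      include: ['app/main']
    }
  }
};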
Also, I'd suspect that once you figure out how to get the list of scripts, you may have special considerations in order to get the Dojo AMD loader to be able to use the cached files. See: dojo and the offline application cache
Good luck.
I've got a painfully simple jQuery plugin that I've written and placed on github. I am using Github for Windows and the website itself to manage the project.
Unfortunately, if I try to include any of the .js or .css files that I've uploaded there through the Raw links, it fails in my browser due to the MIME type being text/plain.
So, for the last couple of hours I've been researching how to get a copy of the files, through github, that people (including myself) can link to. The first step seems to be creating a project page (gh-pages branch)... that much I have grasped.
However, all of the material I've found so far either expects you to have a UNIX-based system, or do some console-based trickery:
Examples: GitHub, SO, SO
Now, there must be a straightforward way to simply make these source files available for inclusion. I went through the automated steps of creating a 'project page' and now I'm presented with another branch that is claiming to be behind the 'master' branch, but I can't see what I'm supposed to do next. It's not even clear to me why on earth I'm required to make another branch. This whole thing seems far more complicated than it needs to be.
So, to recap:
I've created a branch in Github (using Windows app and website)
I can manage that, and update my files, without incident
I am unable to include the .js and .css files using the 'raw' links
I want to be able to include those files in a page
I'd like to do this through Github for Windows, or on the site itself
If anyone could help walk me through this, I'd appreciate it. Also, I'd expect that A LOT of others would as well.
EDIT: Here is an example of a well-known Github project that has its files available through Github:
Select2:
http://ivaynberg.github.io/select2/select2-3.4.2/select2.js
EDIT2: Okay, conceptually, I now understand why I have to create a separate branch, in order to share the files - as the source control aspects of Github aren't meant to act as a CDN, the project page simply provides a public website where you can place your files. So the question now becomes: How do I put my files from the master branch into the gh-pages branch? I'm not worried about automating it or anything right now, all I want is access to the directory structure so I can place files in there. I've tried syncing and re-syncing my branch with Github Windows, but it tells me that there's nothing to get from the gh-pages branch, even though it's "10 commits behind". What is going on?
EDIT3: Added my own answer, for what I've come up with (so far).
As mentioned, there's lots of information out there for people who are using console-based Git software. However, I could not find a single piece of info on how to do this solely through Github Windows. Well, here is the solution:
Process:
Create a project page, as described here: https://help.github.com/articles/creating-pages-with-the-automatic-generator
Unfortunately, they only have a console-based solution for getting a local copy. So here's how the rest of this works in Github Windows... (assumption: project name is myproject, consisting of myproject.js and myproject.css)
After the page has been created (takes a few minutes), open up Github Windows.
In Github Windows, open the repository for the project. On the top menubar it has "in sync", "master", "tools". Click on "master" and switch to "gh-pages" branch - SO example.
When you do this, the folder C:\Users\YourName\Documents\GitHub\myproject will now display the files for the "gh-pages" branch. If you click "master" in Github Windows, it will change the folder structure to once again represent the "master" branch. This is what confused me earlier: you can't see the directory structure for both branches at the same time.
Select the "master" branch in Github Windows.
In Windows Explorer, copy myproject.js and myproject.css into a separate directory (e.g., c:\temp).
Go back to Github Windows and select the "gh-pages" branch.
Go back to Windows Explorer and cut the files you put into c:\temp and paste them into a directory like C:\Users\YourName\Documents\GitHub\myproject\myproject-1.0\
Go back to Github Windows, and you'll see "2 files to be committed". Type in your commit message and click 'Commit'.
Then click 'Sync'.
You can now include these files in your webpages, using a URL like: http://yourname.github.io/myproject/myproject-1.0/myproject.js
Obviously this is a huge pain in the ass to do it this way if you expect to be updating the source file(s) regularly, so an automated approach would be ideal. There is an answer for this on SO here; unfortunately it involves UNIX-based scripting, of which I have zero knowledge (and, truthfully, no interest in learning just for this). If anyone comes up with a more efficient way of doing this using only GUI-based tools, I'm sure myself and many others would be interested in hearing about it.
EDIT: This solution is obviously usurping Github's intended way of doing things, as when I click on the "gh-pages" branch on the github website it tells me that it's "5 commits ahead and 11 commits behind" the master branch, even though they have the same files. So, again, if anyone else has a better GUI-based solution to this problem, I'm all ears.
Git(hub) software for Windows is the buggiest thing I've ever used (well, besides Windows itself). Back when I used Windows, I could hardly get anything to work with Git at all.
But, anyways, to answer your question, if you open a command prompt and type in
git checkout -b gh-pages
(if it complains about branch gh-pages already existing, remove the -b.)
it should switch the branch. Then, you can launch notepad++ or whatever text editor you use (you might have to do it from the terminal, I can't remember), add the file you want, and then type in (in cmd):
git add .
This recursively adds all files in the folder to Git.
git commit -m "Add file for easy user download"
This creates the commit with your message.
git remote add origin git@github.com:yourusername/yourrepository.git
This adds the Github repo as a remote so you can push to it.
git push origin gh-pages
This pushes your changes to Github.
And, you're all set!
You might want to read this on Git branching.