“gcloud app deploy” static website is skipping too many files

I have a static website hosted on Google App Engine (GAE). When I deploy it, some pictures are not being uploaded.
How can I fix this problem?
Console message:
Some files were skipped. Pass `--verbosity=info` to see which ones.
You may also view the gcloud log file, found at
[C:\Users\Maher\AppData\Roaming\gcloud\logs\2017.06.22\13.28.54.036000.log].
This is my app.yaml:
runtime: php55
api_version: 1
threadsafe: true

handlers:
- url: /
  static_files: www/index.html
  upload: www/index.html

- url: /(.*)
  static_files: www/\1
  upload: www/(.*)
Folder tree:
>Website
  >www
    >css
    >fonts
    >img
      >photos
        pic_1.jpg
        pic_2.jpg
        ...
    >js
    blog_post.html
    photos.html
    index.html
  app.yaml

Put all your static files in one directory, say assets (containing css, fonts, img, js), then serve it as a static directory:
- url: /assets
  static_dir: assets
  expiration: "1h"
  http_headers:
    Vary: Accept-Encoding
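Putting the two together, a complete app.yaml might look like the sketch below. The www/assets path assumes the css, fonts, img, and js folders from the tree above are moved into an assets subfolder under www; that layout is what this answer suggests, not the asker's current one:

runtime: php55
api_version: 1
threadsafe: true

handlers:
# All static assets live under one directory and are uploaded wholesale.
- url: /assets
  static_dir: www/assets
  expiration: "1h"
  http_headers:
    Vary: Accept-Encoding

- url: /
  static_files: www/index.html
  upload: www/index.html

# Remaining pages (blog_post.html, photos.html, ...).
- url: /(.*)
  static_files: www/\1
  upload: www/(.*)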

Related

Browser-sync - proxy a domain gets HTTP error 403 - you don't have authorization to view this page

I run a gulp task using the Node.js module browser-sync, as below.
=== File gulpfile.js ===
let browserSync = require('browser-sync').create();
let gulp = require('gulp'); // needed for gulp.task

gulp.task('browser-sync', function () {
  browserSync.init({
    open: true,
    injectChanges: true,
    proxy: 'https://generalgulp.devsunset',
    host: '192.168.1.76',
    serveStatic: ['.'],
    https: {
      key: 'C:\\WebProjects\\GeneralGulp\\resources\\certificates\\server-generalgulp.key',
      cert: 'C:\\WebProjects\\GeneralGulp\\resources\\certificates\\server-generalgulp.crt'
    }
  });
});
=== ===
My local project information is as below (the latest versions as of this post):
Node version: 17.1.0
NPM version: 8.1.3
gulp: 4.0.2
NPM module browser-sync: 2.27.7
I run the browser-sync task. The output looks good.
==>
Using gulpfile C:\WebProjects\GeneralGulp\gulpfile.js
[Browsersync] Starting 'browser-sync'...
[Browsersync] Proxying: https://generalgulp.devsunset
Access URLs:
Local: https://localhost:3000
External: https://192.168.1.76:3000
UI: http://localhost:3001
UI External: http://localhost:3001
==>
I have already added the SSL certificate for this domain to the trusted root store, and I have DNS records pointing the domain ( https://generalgulp.devsunset ) to the IP addresses 127.0.0.1 and 192.168.1.76.
I can access the site from both the local and the external address.
However, when I try to access the local resources using the proxied domain ( https://generalgulp.devsunset ), I get an HTTP 403:
Access to <my_custom_domain> was denied. You are not authorized to view this page.
I assumed that when running my gulp "browser-sync" task, it would translate the custom domain to https://localhost:3000 or https://192.168.1.76:3000.
I have followed the documentation at https://browsersync.io/docs exactly, and I have attempted every solution I could find; those attempts led to the gulp task shown at the beginning.
I would appreciate suggestions on what to try next to troubleshoot why browser-sync cannot proxy my domain. Is there a parameter missing in my gulp task?
Thanks!
I have modified the "proxy" parameter as below, and it works when I access the proxied domain with the given port
(in my case, http(s)://generalgulp.devsunset:3000 ):
gulp.task('browser-sync', function () {
  browserSync.init({
    open: true,
    injectChanges: true,
    proxy: 'generalgulp.devsunset',
    host: '192.168.1.76',
    serveStatic: ['.'],
    https: {
      key: 'C:\\WebProjects\\GeneralGulp\\resources\\certificates\\server-generalgulp.key',
      cert: 'C:\\WebProjects\\GeneralGulp\\resources\\certificates\\server-generalgulp.crt'
    }
  });
});
This is an acceptable temporary solution within the scope of the current question.
However, what I expect is for browser-sync to automatically forward traffic from the custom domain ( http(s)://generalgulp.devsunset ) to ( http://192.168.1.76:3000 ).
Does browser-sync allow users to do that?
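Browsersync also accepts a port option, so one sketch of an answer is to bind Browsersync itself to the default HTTPS port, so the bare domain reaches it without an explicit :3000. This assumes nothing else is listening on 443 and that the process is allowed to bind a privileged port:

let browserSync = require('browser-sync').create();
let gulp = require('gulp');

gulp.task('browser-sync', function () {
  browserSync.init({
    proxy: 'generalgulp.devsunset',
    host: '192.168.1.76',
    port: 443, // standard HTTPS port; may require elevated privileges
    serveStatic: ['.'],
    https: {
      key: 'C:\\WebProjects\\GeneralGulp\\resources\\certificates\\server-generalgulp.key',
      cert: 'C:\\WebProjects\\GeneralGulp\\resources\\certificates\\server-generalgulp.crt'
    }
  });
});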

Zlib incorrect header check when decompressing

I need to receive a zip file from a server, extract it, and pipe its contents elsewhere.
However, when trying to extract it with createInflate from the built-in zlib package, I get the error Error: incorrect header check.
(I have tried createUnzip and createGunzip too.)
Downloading the file with curl and extracting it with the unzip Linux command works correctly.
$ unzip report.zip
Archive: report.zip
inflating: report.csv
$ zipinfo -v report.zip
[...]
file system or operating system of origin: MS-DOS, OS/2 or NT FAT
version of encoding software: 2.0
minimum file system compatibility required: MS-DOS, OS/2 or NT FAT
minimum software version required to extract: 2.0
compression method: deflated
compression sub-type (deflation): normal
file security status: not encrypted
extended local header: yes
[...]
Code used to extract the already downloaded file:
const fs = require('fs');
const stream = require('stream');
const { promisify } = require('util');
const { createInflate } = require('zlib');

const pipeline = promisify(stream.pipeline);

(async () => {
  const unzipper = createInflate();
  const sourceStream = fs.createReadStream('report.zip');
  const destStream = fs.createWriteStream('report.csv');
  await pipeline(sourceStream, unzipper, destStream);
})();
Note that the error is the same whether I pipe the response directly or pipe the result of createReadStream.
Full zipinfo -v:
$ zipinfo -v report.zip
Archive: report.zip
There is no zipfile comment.
End-of-central-directory record:
-------------------------------
Zip archive file size: 527 (000000000000020Fh)
Actual end-cent-dir record offset: 505 (00000000000001F9h)
Expected end-cent-dir record offset: 505 (00000000000001F9h)
(based on the length of the central directory and its expected offset)
This zipfile constitutes the sole disk of a single-part archive; its
central directory contains 1 entry.
The central directory is 78 (000000000000004Eh) bytes long,
and its (expected) offset in bytes from the beginning of the zipfile
is 427 (00000000000001ABh).
Central directory entry #1:
---------------------------
report_SMS_1f7c2069_20200730.csv
offset of local header from start of archive: 0
(0000000000000000h) bytes
file system or operating system of origin: MS-DOS, OS/2 or NT FAT
version of encoding software: 2.0
minimum file system compatibility required: MS-DOS, OS/2 or NT FAT
minimum software version required to extract: 2.0
compression method: deflated
compression sub-type (deflation): normal
file security status: not encrypted
extended local header: yes
file last modified on (DOS date/time): 2020 Jul 30 11:05:48
32-bit CRC value (hex): 5abe6238
compressed size: 349 bytes
uncompressed size: 934 bytes
length of filename: 32 characters
length of extra field: 0 bytes
length of file comment: 0 characters
disk number on which file begins: disk 1
apparent file type: binary
non-MSDOS external file attributes: 000000 hex
MS-DOS file attributes (00 hex): none
There is no file comment.
zlib is not zip. zip is not zlib. They are two different formats. gzip is yet another. (The use of the word "unzip" in the node.js zlib interface is misleading.)
You need something that unzips zip files. Take a look at this.
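For illustration, here is a minimal sketch using the third-party unzipper npm package to pull the CSV out of the archive; the package choice is an assumption (the answer above doesn't name one), and any zip-capable library would do:

const fs = require('fs');
const unzipper = require('unzipper'); // npm install unzipper

fs.createReadStream('report.zip')
  .pipe(unzipper.Parse())
  .on('entry', (entry) => {
    // entry.path is the file name inside the archive
    if (entry.path.endsWith('.csv')) {
      entry.pipe(fs.createWriteStream('report.csv'));
    } else {
      entry.autodrain(); // discard entries we don't need
    }
  });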

Gulp webserver unable to load a cachebusted JS resource

I have a bunch of JS files that are versioned during deployment for cache-busting. It looks like this:
<script src="dist/js/bundle.js?v=ju29jj39983eddd2"></script>
I perform minification and compression using gulp, then save the output to a local directory with the version value appended to the filename. Here's the code:
gulp.task('bundle', function () {
  return gulp
    .src(config.app_scripts) // app_scripts is an array containing the list of files
    .pipe(gutil.env.type === 'production' ? uglify({mangle: true}) : gutil.noop())
    .pipe(gutil.env.type === 'production' ? concat('b-bundle.js?v=' + secureRand) : concat('b-bundle.js'))
    .pipe(gulp.dest('dist/js'));
});
I serve the assets in the development environment using gulp-webserver, with the configuration below. However, it doesn't pick up the JS file from the directory; it just falls back to index.html when the page loads.
// Local webserver
gulp.task('webserver', function () {
  gulp.src(__dirname + '/client')
    .pipe(webserver({
      livereload: false,
      open: false,
      directoryListing: false,
      fallback: 'index.html',
      proxies: proxiesConf
    }));
});
I'm not sure what is causing this behavior. I would highly appreciate it if somebody could help me.
Nowadays, cache-busting with query strings is discouraged:
Most proxies, most notably Squid up through version 3.0, do not cache resources with a "?" in their URL even if a Cache-control: public header is present in the response. To enable proxy caching for these resources, remove query strings from references to static resources, and instead encode the parameters into the file names themselves.
-- https://gtmetrix.com/remove-query-strings-from-static-resources.html
Instead, you should (a) let the webserver invalidate the cache by adding Cache-Control: no-cache, no-store, must-revalidate to the response headers, or (b) add a content hash to the file name of the resource:
<script src="assets/js/edf-d41d8cd98f00b204e9800998ecf8427e.min.js" />
instead of
<script src="assets/js/edf.min.js" />
-- https://medium.com/@codebyamir/a-web-developers-guide-to-browser-caching-cc41f3b73e7c
Good luck :)
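As a sketch of approach (b) in this gulp setup, the gulp-rev plugin can append a content hash to the bundle name; gulp-rev is an assumption here, since the question's gulpfile doesn't use it:

const gulp = require('gulp');
const rev = require('gulp-rev'); // npm install gulp-rev

gulp.task('bundle-rev', function () {
  return gulp.src('dist/js/b-bundle.js')
    .pipe(rev())           // renames to e.g. b-bundle-d41d8cd9.js
    .pipe(gulp.dest('dist/js'))
    .pipe(rev.manifest())  // writes rev-manifest.json mapping old -> new names
    .pipe(gulp.dest('dist/js'));
});

The manifest file can then be used to rewrite the script references in index.html during the build.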

Sinatra-assetpack not merging js files

I've been trying to compile the assets on a remote server with the sinatra-assetpack gem, with no luck so far. It compiles when deploying to Heroku or on my local machine, but not on the remote server.
I have this configuration in my sinatra app file:
assets do
  serve '/js', :from => 'assets/javascripts'
  serve '/css', :from => 'assets/stylesheets'
  serve '/images', from: 'assets/images'
  serve '/bower_components', from: 'bower_components'

  js :landing, [
    '/bower_components/sweetalert/lib/sweet-alert.min.js',
    '/js/back-to-top.js',
    '/js/subscription.js'
  ]

  js :checkout, [
    '/js/form.js',
    '/js/vendor/*.js'
  ]

  css :landing, [
    '/bower_components/sweetalert/lib/sweet-alert.css',
    '/css/normalize.css',
    '/css/landing.css'
  ]

  css :checkout, [
    '/css/normalize.css',
    '/css/checkout.css',
    '/css/vendor/animate.css'
  ]

  js_compression :jsmin
  css_compression :sass
end
When I execute rake assetpack:build, all the files are compiled correctly except checkout.js. It does generate a public/assets/javascripts/checkout.js and its fingerprinted version, but both of them contain just an <h1> Internal Server Error </h1>.
Removing form.js (which is in reality a CoffeeScript file named form.coffee) from the precompilation process outputs a correctly compiled checkout.js. What is driving me nuts is that form.coffee is correctly converted from CoffeeScript to JavaScript (I can see it at public/js/form.js), but it seems it cannot be merged with the vendor files.
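One way to isolate the CoffeeScript step (a sketch, assuming the standard coffee CLI is installed on the server, which the question doesn't confirm) is to precompile form.coffee to plain JavaScript yourself, so AssetPack only has to merge .js files:

# Compile form.coffee into assets/javascripts/form.js, then let
# `rake assetpack:build` merge plain JS files only.
coffee --compile --output assets/javascripts assets/javascripts/form.coffee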

404 error while loading js files

I am using webapp2 to build a web app with AngularJS. This is the directory structure:
|--test-app
   |--lib
   |--static
      |--js
         |--app.js
         |--controller.js
      |--lib
         |--angular
         |--angular-bootstrap
      |--index.html
   |--app.yaml
   |--mainapp.py
But when I try to load the js files in index.html
<!DOCTYPE html>
<html lang="en" ng-app="testApp">
<head>
  <script src="/static/js/app.js"></script>
  <script type="text/javascript" src="/static/js/controller.js"></script>
</head>
<body>
  <div ng-controller="MainController">
    IN MAIN
  </div>
</body>
</html>
I get these errors:
GET http://localhost:8080/static/js/app.js (404 - file not found)
GET http://localhost:8080/static/js/controller.js (404 - file not found)
I cannot figure out why I am getting these errors.
Here is the code for app.yaml:
application: test-app
version: 1
runtime: python27
api_version: 1
threadsafe: true

handlers:
- url: /.*
  script: mainapp.app

libraries:
- name: webapp2
  version: "2.5.2"
Here is the code for mainapp.py:
import webapp2

class Start(webapp2.RequestHandler):
    def get(self):
        self.response.headers['Content-Type'] = 'text/html'
        self.response.write(open('static/index.html').read())

app = webapp2.WSGIApplication([
    ('/', Start),
], debug=True)
You must explicitly declare the location of your static content in app.yaml, before the catch-all handler:
handlers:
- url: /static
  static_dir: static

- url: /.*
  script: mainapp.app
See docs for details:
Unlike a traditional web hosting environment, Google App Engine does not serve files directly out of your application’s source directory unless configured to do so.
URL handler path patterns are tested in the order they appear in app.yaml, from top to bottom. In this case, the /static pattern will match the appropriate paths before the /.* pattern does.
I moved index.html out of the static folder and then declared the static URL explicitly, as suggested by @Selcuk, and it works now.
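For reference, the resulting app.yaml would look something like the sketch below, combining the original file with the accepted fix; the ordering of the handlers is what matters:

application: test-app
version: 1
runtime: python27
api_version: 1
threadsafe: true

handlers:
# Served directly by App Engine; must come before the catch-all.
- url: /static
  static_dir: static

- url: /.*
  script: mainapp.app

libraries:
- name: webapp2
  version: "2.5.2"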
