Grunt dev server to allow push states

I am trying to set up my grunt server to allow push states.
After countless Google searches and reading SO posts, I still cannot figure out how to do this.
I keep getting errors like the one below.
Does anyone have any ideas how to fix this?
No "connect" targets found. Warning: Task "connect" failed. Use --force to continue.
It appears to me that I have defined a target in the config below, with these lines:
open: {
  target: 'http://localhost:8000'
}
See the complete code below:
var pushState = require('grunt-connect-pushstate/lib/utils').pushState;

module.exports = function(grunt) {
  // Project configuration.
  grunt.initConfig({
    pkg: grunt.file.readJSON('package.json'),
    connect: {
      options: {
        hostname: 'localhost',
        port: 8000,
        keepalive: true,
        open: {
          target: 'http://localhost:8000'
        },
        middleware: function (connect, options) {
          return [
            // Rewrite requests to root so they may be handled by router
            pushState(),
            // Serve static files
            connect.static(options.base)
          ];
        }
      }
    }
  });

  grunt.loadNpmTasks('grunt-contrib-uglify'); // Load the plugin that provides the "uglify" task.
  grunt.loadNpmTasks('grunt-contrib-connect'); // Load the plugin that provides the "connect" task.

  // Default task(s).
  grunt.registerTask('default', ['connect']);
};

Push states are already included in most SPA frameworks, so you might not need this unless you're building a framework.
Angular: https://scotch.io/tutorials/pretty-urls-in-angularjs-removing-the-hashtag
React: How to remove the hash from the url in react-router
This looks like a Grunt build script that compiles an application to serve, so I'm not exactly sure how you'd use pushState in the build process. You may be trying to solve the wrong problem.
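That said, the immediate cause of the error is that grunt-contrib-connect is a multi-task: it looks for named targets, and a connect block containing only a top-level options object gives it nothing to run, hence "No 'connect' targets found". A minimal sketch of the same config wrapped in a target (the target name server is illustrative; everything else is taken from the question):
connect: {
  server: {                        // a named target, so the connect task has something to run
    options: {
      hostname: 'localhost',
      port: 8000,
      keepalive: true,
      open: {
        target: 'http://localhost:8000'
      },
      middleware: function (connect, options) {
        return [
          pushState(),                  // rewrite requests to root for the client-side router
          connect.static(options.base)  // serve static files, as in the original config
        ];
      }
    }
  }
}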

Don't bother with Grunt to run a local dev pushstate server for your SPA.
In your project directory, install https://www.npmjs.com/package/pushstate-server
npm i pushstate-server -D
Then to launch it, add a script entry in the package.json of your project:
…
"scripts": {
"dev": "pushstate-server"
}
…
This way you can now start it by running npm run dev.
All the requests which would normally end in a 404 will now redirect to index.html.
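If you need to serve a specific build directory or port, the package also exposes a small programmatic API; the sketch below assumes its start() options look like the examples in its README (the port and directory values are placeholders):
// server.js - hypothetical launcher using pushstate-server's programmatic API
var server = require('pushstate-server');

server.start({
  port: 8080,          // placeholder port
  directory: './dist'  // placeholder build output directory
});
You would then point the "dev" script at node server.js instead of the bare pushstate-server command.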

Related

Configure local php endpoint for Axios in Nuxt JS

I want to use Axios to make client AJAX calls that point to a local PHP file. Why PHP? I have tons of security-tested approaches in a simple PHP-based API that work great and don't need re-inventing.
I'm guessing I'm missing something easy, but so far I've had a wonky/hacky approach that works. Right now, I've got a PHP file in the /static folder, and in nuxt.config.js I have a proxy setup:
... other config stuff
axios: {
  proxy: true
},
proxy: {
  '/api': 'http://some-site.local/api.php'
}
... other config stuff
In order for the url above to resolve, I have a host entry setup via MAMP that resolves http://some-site.local to the /static directory in my Nuxt project.
So far, it works. But it requires setting up MAMP to have the hosts entry, and when it comes to npm run build, this approach fails: the build takes the PHP files from /static and puts them in the docroot of /dist, which breaks the API proxy setup for Axios in nuxt.config.js.
I really don't want to install some PHP package (I've seen that Laravel has one that works with Nuxt), because the aim is just to be able to have a couple PHP files within my Nuxt project instead of a full library. Anyone have any insight on what I'm missing to make this work better?
For Nuxt and PHP all in one project (a small project):
Let's say Node and the PHP CLI are already installed.
Create NUXT project:
npx create-nuxt-app my-app
Create a file, static/api/index.php (let's say):
<?php
header('Content-type: application/json; charset=utf-8');

$rawPayload = file_get_contents('php://input');

try {
    $payload = json_decode($rawPayload, true);
} catch (Exception $e) {
    die(json_encode(['error' => 'Payload problem.']));
}

echo json_encode([
    'echo' => $payload,
]);
Install dependencies:
npm i -D concurrently
npm i @nuxtjs/axios @nuxtjs/proxy
Update nuxt.config.js:
module.exports = {
  ...
  modules: [
    ...
    '@nuxtjs/axios',
    '@nuxtjs/proxy',
  ],
  ...
  proxy: {
    '/api': {
      target: 'http://localhost:8000',
      pathRewrite: {
        '^/api': '/'
      }
    },
  },
  axios: {
    baseURL: '/',
  },
}
Update package.json (this dev script assumes the cross-env and nodemon packages and a custom server entry at server/index.js):
"dev": "concurrently -k \"cross-env NODE_ENV=development nodemon server/index.js --watch server\" \"php -S 0.0.0.0:8000 static/api/index.php\"",
And it's ready.
Now, locally in development, the API is available thanks to the proxy, and after deployment it is simply served under the same path.
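To sanity-check the round trip, a page can post to the proxied endpoint with the injected Axios instance; a minimal sketch (the component method and payload are illustrative, not part of the original answer):
// Inside a Nuxt page or component's <script> block
export default {
  methods: {
    async sendEcho() {
      // POST to /api; the proxy rewrites it to the PHP server on localhost:8000
      const res = await this.$axios.$post('/api', { hello: 'world' });
      console.log(res.echo); // index.php echoes the payload back
    }
  }
};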

How do I set up AWS Cloud9 to run an existing JavaScript app with webpack-dev-server (in development mode)?

I am trying to get my fairly typical JavaScript (React) app to run in dev mode on AWS Cloud9. I successfully cloned my repo (using https ugh), installed my npm packages, and can run scripts in the console. However, I don't know how to run and access the app in dev mode. There are a plethora of docs but they all seem to dance around the running part. My guess is I need to somehow set a custom host and port, but I also need to find what URL to use to see the app running.
Here is my devServer config:
devServer: {
  // Display only errors to reduce the amount of output.
  stats: "errors-only",
  host, // Defaults to `localhost`
  port, // Defaults to 8080
  overlay: {
    errors: true,
    warnings: true,
  },
}
If anyone comes across this, I wanted to share my solution because I know how frustrating this can be:
First, create a script in your package.json file:
"start": "webpack-dev-server --open"
Then, add the following to your Webpack config file:
devServer: {
  contentBase: path.join(__dirname, 'dist'),
  host: '0.0.0.0',
  port: 8080,
  compress: true,
}
Then, open the terminal in AWS Cloud9 and run the script:
npm start
Finally, click on the link in the terminal ("Project is running at http://0.0.0.0:8080/") and your app will show in a new window.
If it doesn't work, don't forget to allow port 80 on your Cloud9 security group: https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/working-with-security-groups.html#adding-security-group-rule
If you want to view the project in the preview pane, you can add the following to your devServer config:
disableHostCheck: true,
However, it's important to note that when set to true, this option bypasses host checking. THIS IS NOT RECOMMENDED as apps that do not check the host are vulnerable to DNS rebinding attacks.
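A somewhat safer alternative, assuming webpack-dev-server 3.x, is to whitelist the preview host explicitly rather than disabling the check altogether (the domain below is a placeholder; use whatever host your Cloud9 preview URL shows):
devServer: {
  // Only accept requests whose Host header matches the preview domain
  allowedHosts: ['.c9users.io']  // placeholder domain
}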
1) The first thing you need to do is run the React app on port 8080. You can do this by setting the PORT environment variable to 8080 and then starting the React dev server from the AWS Cloud9 terminal.
export PORT=8080
npm start
For details look at this discussion on GitHub.
2) After starting your application you can preview it by clicking Preview -> Preview Running Application at the top of AWS Cloud9.
For more details check this AWS Cloud9 doc
In webpack.config.js:
devServer: {
historyApiFallback: true,
contentBase: './',
host: process.env.IP,
//https: true,
port: process.env.PORT,
"public": "your-project.c9users.io" //no trailing slash
},
Refer Link

How to only run a gulp.task locally?

I have the following gulp function:
// TODO: Comment for production.
gulp.task('startServer', function() {
  return connect.server({
    root: './dist',
    port: 8080
  });
});
Every time I pull it to work on it locally, I have to uncomment the code and then comment it back out when I push to prod. I have to do something similar in a few files. Is there a clever way to avoid this hassle and be able to pull/push code without having to comment/uncomment all of this for every single branch I work on?
Thanks.
You don't need gulp code to start the server; you can run local and production servers using Express on Node.js.
On your production server, the NODE_ENV environment variable should be set to production (NODE_ENV=production). So you can add a conditional to your gulp file to check whether you are running it on the production server or not:
if (process.env.NODE_ENV !== 'production') {
  gulp.task('startServer', function() {
    return connect.server({
      root: './dist',
      port: 8080
    });
  });
}
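If other task lists reference startServer, conditionally registering it would make those lists fail with an unknown-task error. An alternative sketch (same gulp 3 style as the question) is to always register the task but make it a no-op in production:
// Always registered, but skips starting the server on production builds
gulp.task('startServer', function(done) {
  if (process.env.NODE_ENV === 'production') {
    return done(); // nothing to do on the production server
  }
  connect.server({
    root: './dist',
    port: 8080
  });
  done();
});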

How to run Angular 2 app without browser-sync and node?

I have created an Angular 2 app with TypeScript. I am using Angular 2 routes for routing.
Using lite-server to start my Angular 2 app, it works fine and routes properly even when the page is refreshed.
ISSUE:
But once I deployed the TS-compiled code to my domain, which uses http-server to serve the files, it stopped routing properly.
Whenever I refresh a page on my domain (blog.jyotirmaysenapati.com), it shows the following:
Not Found
The requested URL /blog/new was not found on this server.
Additionally, a 404 Not Found error was encountered while trying to
use an ErrorDocument to handle the request.
Please help with this, as I do not want to have Node support on my domain. So how can I run it properly without the help of Node and browser-sync?
Is that even possible?
I am using the latest version of the Angular 2 framework.
Anyone can see my code here.
You can use gulp-connect, which provides a fallback option:
var gulp = require('gulp'),
    connect = require('gulp-connect');

gulp.task('connect', function() {
  connect.server({
    root: '',
    port: 8080,
    fallback: 'index.html',
    middleware: function(connect) {
      return [
        connect().use('/node_modules', connect.static('./node_modules')),
        connect().use('/app', connect.static('./dist'))
      ];
    }
  });
});
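Run it with gulp connect and browse to http://localhost:8080. The key piece is fallback: 'index.html': any request that would otherwise end in a 404 (for example a refresh on /blog/new) is answered with index.html, so the Angular router takes over on the client. Whatever server ultimately hosts the files needs the same rewrite-to-index.html behaviour.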

GruntFile automated deployment using ssh (Windows)

I am trying to set up a GruntFile.js file to automate the process of logging on to my personal website's server via ssh and pulling the latest version of the git repo. The relevant part of my grunt file looks like this:
sshconfig: {
  portfolioServer: {
    host: 'mySite.com',
    username: 'root',
    agent: process.env.SSH_AUTH_SOCK,
  }
},
sshexec: {
  deploy: {
    command: [
      'cd portfolio',
      'git pull'
    ].join(' && ')
  },
  options: {
    config: 'portfolioServer'
  }
},
However, when I run the associated task (I named it "grunt deploy"), I get the following error:
Running "sshexec:deploy" (sshexec) task
Warning: Connection :: error :: Error: Authentication failure. Available authentication methods: publickey,password Use --force to continue.
Aborted due to warnings.
My understanding is that this error means that I have not set up the public/private SSH keys correctly. However, I have already gone through the process of setting up public/private keys. I am already able to run the following command through Git Bash and log in successfully:
ssh root@mySite.com
I have searched online for this problem, and it seems like it might have something to do with process.env.SSH_AUTH_SOCK not behaving in Git Bash on Windows the same way it would be expected to behave on a native Linux distribution.
What further steps in my setup do I have to take in order to make this deployment configuration work?
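One thing that may be worth trying, since agent forwarding via SSH_AUTH_SOCK is unreliable under Git Bash on Windows: grunt-ssh also documents key-file authentication through a privateKey option read with grunt.file.read. A sketch under that assumption (the key path and the passphrase environment variable are placeholders):
sshconfig: {
  portfolioServer: {
    host: 'mySite.com',
    username: 'root',
    // Read the private key file directly instead of relying on SSH_AUTH_SOCK
    privateKey: grunt.file.read(require('os').homedir() + '/.ssh/id_rsa'),
    passphrase: process.env.SSH_KEY_PASSPHRASE // only needed if the key is encrypted
  }
}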
