Configure local PHP endpoint for Axios in Nuxt.js

I want to use Axios to make client AJAX calls that point to a local PHP file. Why PHP? I have tons of security-tested approaches in a simple PHP-based API that work great and don't need re-inventing.
I'm guessing I'm missing something easy, but so far I've had a wonky/hacky approach that works. Right now, I've got a PHP file in the /static folder, and in nuxt.config.js I have a proxy setup:
... other config stuff
axios: {
  proxy: true
},
proxy: {
  '/api': 'http://some-site.local/api.php'
}
... other config stuff
In order for the URL above to resolve, I have a hosts entry set up via MAMP that resolves http://some-site.local to the /static directory in my Nuxt project.
So far, it works. But it requires setting up MAMP with the hosts entry, and when it comes to npm run build, this approach fails: the build takes the PHP files from /static and puts them in the docroot of /dist, which breaks the API proxy setup for Axios in nuxt.config.js.
I really don't want to install some PHP package (I've seen that Laravel has one that works with Nuxt), because the aim is just to have a couple of PHP files within my Nuxt project instead of a full library. Does anyone have any insight into what I'm missing to make this work better?

For Nuxt and PHP all in one project (small project):
Let's say Node and the PHP CLI are already installed.
Create NUXT project:
npx create-nuxt-app my-app
Create a file static/api/index.php (say):
<?php
header('Content-Type: application/json; charset=utf-8');

$rawPayload = file_get_contents('php://input');

try {
    // JSON_THROW_ON_ERROR (PHP 7.3+) makes json_decode throw on invalid JSON,
    // so the catch block below can actually fire
    $payload = json_decode($rawPayload, true, 512, JSON_THROW_ON_ERROR);
} catch (JsonException $e) {
    die(json_encode(['error' => 'Payload problem.']));
}

echo json_encode([
    'echo' => $payload,
]);
Install the dependencies:
npm i -D concurrently
npm i @nuxtjs/axios @nuxtjs/proxy
Update nuxt.config.js:
module.exports = {
  // ...
  modules: [
    // ...
    '@nuxtjs/axios',
    '@nuxtjs/proxy',
  ],
  // ...
  proxy: {
    '/api': {
      target: 'http://localhost:8000',
      pathRewrite: {
        '^/api': '/'
      }
    },
  },
  axios: {
    baseURL: '/',
  },
}
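To see what the pathRewrite rule does, here is a simplified re-implementation of the matching logic (an illustration only, not the actual @nuxtjs/proxy / http-proxy-middleware internals):

```javascript
// Simplified sketch of how proxy pathRewrite rules are applied
function rewritePath(path, rules) {
  for (const [pattern, replacement] of Object.entries(rules)) {
    const re = new RegExp(pattern);
    if (re.test(path)) {
      return path.replace(re, replacement); // first matching rule wins
    }
  }
  return path; // no rule matched: forward the path unchanged
}

// With { '^/api': '/' }, a request to /api has the prefix stripped
// before being forwarded to the target at http://localhost:8000:
console.log(rewritePath('/api', { '^/api': '/' })); // '/'
console.log(rewritePath('/other', { '^/api': '/' })); // '/other' (untouched)
```

So a browser request to /api ends up at the PHP dev server's root, where php -S serves static/api/index.php.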
Update package.json:
"dev": "concurrently -k \"cross-env NODE_ENV=development nodemon server/index.js --watch server\" \"php -S 0.0.0.0:8000 static/api/index.php\"",
And it's ready.
In development the API is available locally thanks to the proxy; after deployment it is served directly under the same path.


We're sorry but sf doesn't work properly without JavaScript enabled. Please enable it to continue

I have a Vue.js app that makes an Axios call. I Dockerized it and used DigitalOcean to serve it from the cloud. The app is working, but after any Axios call I get "We're sorry but sf doesn't work properly without JavaScript enabled. Please enable it to continue." as the response.
I have tried changing the port from 8080 to 8081, running the app in an incognito browser tab, and changing "baseUrl" to "baseURL"; in the front-end Dockerfile I was installing libraries with npm and also tried yarn, but I still have this issue.
Are there any other ideas on how to fix it?
main.js file:
createApp(App)
  .use(store)
  .use(vue3GoogleLogin, {
    clientId: "******",
  })
  .component("font-awesome-icon", FontAwesomeIcon)
  .component("MazBtn", MazBtn)
  .component("MazInput", MazInput)
  .component("MazPhoneNumberInput", MazPhoneNumberInput)
  .component("Datepicker", Datepicker)
  // .component("VueGlide", VueGlide)
  .use(router)
  .mount("#app");
Frontend Dockerfile:
# Base image
FROM node:lts-alpine
# Install the serve package
RUN npm i -g serve
# Set the working directory
WORKDIR /app
# Copy package.json and package-lock.json
COPY package*.json ./
# Install project dependencies
RUN npm install
# Copy the project files
COPY . .
# Build the project
RUN npm run build
# Expose a port
EXPOSE 3000
# Executables
CMD [ "serve", "-s", "dist" ]
Backend Dockerfile:
FROM python:3.10-bullseye
# Working directory
WORKDIR /app
# Copy the dependencies
COPY ./docs/requirements.txt /app
# Install the dependencies
RUN pip3 install -r requirements.txt
# Copy the files
COPY . .
WORKDIR /app/backend
ENV FLASK_APP=app.py
# Executable commands
CMD [ "python3", "-m" , "flask", "run", "--host=0.0.0.0"]
vue.config.js file:
const { defineConfig } = require("@vue/cli-service");

module.exports = defineConfig({
  transpileDependencies: true,
  devServer: {
    compress: true,
    host: "127.0.0.1",
    proxy: {
      // "/upload/img": {
      //   // target: "http://127.0.0.1:9000",
      //   target: "http://127.0.0.1:5000",
      // },
      "/api": {
        // target: "http://127.0.0.1:9000",
        target: "http://127.0.0.1:5000",
      },
      "/media": {
        target: "http://localhost:8000",
      },
      "/http-bind": {
        target: "https://localhost:8443",
        logLevel: "debug",
      },
    },
    // https: true,
    // watchContentBase: false,
  },
});
Your backend requests are being handled by the frontend app. It looks like you are relying on the development server's proxy functionality in order to forward them to the backend, but this will (and should) not be active when you deploy your app. Instead you will need another proxy that sends /api requests to your backend and other requests to your frontend.
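For example, with nginx as that production-side proxy in front of the two containers, a minimal sketch might look like this (the upstream names frontend and backend, and the ports, are assumptions you would adjust to your own compose service names):

```nginx
server {
    listen 80;

    # Forward API calls to the Flask backend container
    location /api {
        proxy_pass http://backend:5000;
        proxy_set_header Host $host;
    }

    # Everything else goes to the static frontend container
    location / {
        proxy_pass http://frontend:3000;
        proxy_set_header Host $host;
    }
}
```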

How to host json-server in azure

I am new to the software field, so please be patient with my question and any mistakes in technical terms.
Premises:
I have developed a front-end application using Angular 4. The baseURL defined in the Angular application is 'http://localhost:3000/'. My application uses the Restangular API to interact with json-server (I created a folder named json-server, which contains db.json and a public folder). It works perfectly fine when I start json-server with the command:
json-server --watch db.json
My application is finalized, so I created a production build. I then moved all files from the dist folder to the public folder of json-server. When I start json-server, my application works fine.
Actual problem:
Now I want to host it in Azure. I simply copied every file/folder (db.json and the public folder) from the json-server folder as-is and put them in Azure. When the Azure hosting was done and I opened the URL in a browser, I got an error: "you don't have permission to view".
To rectify the above error I deleted all files from Azure and instead copied all the files from the dist folder into Azure. With this I could see the application in the browser, but no images; each image shows the error "Response with status: 0 for URL: null".
When I start json-server locally everything works fine, but of course when the same web page is opened from other machines I get the same error: "Response with status: 0 for URL: null".
Is there any way to run json-server in Azure so that all machines/mobiles accessing the URL can see the proper web page without any errors?
Steps to run json-server on an Azure Web App:
Open your browser and go to the App Service Editor (https://<your-app-name>.scm.azurewebsites.net/dev/wwwroot/)
Run the command in the Console (Ctrl+Shift+C):
npm install json-server --save-dev
Put all files/folders (db.json and the public folder) into the wwwroot folder
Create a server.js with the following content:
const jsonServer = require('json-server')
const server = jsonServer.create()
const router = jsonServer.router('db.json')
const middlewares = jsonServer.defaults()

server.use(middlewares)
server.use(router)
server.listen(process.env.PORT, () => {
  console.log('JSON Server is running')
})
Click Run (Ctrl+F5), this will generate web.config file automatically and open your website in the browser.
You can use the following to quickly set up a mock service which can serve REST APIs off static JSON files.
json-server
Just install the Node.js module (npm install -g json-server). Populate a static JSON file in the format attached, then run the JSON server (json-server --watch db.json --port 3000).
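For reference, a minimal db.json in the format json-server expects (the posts resource name here is just an example):

```json
{
  "posts": [
    { "id": 1, "title": "hello" },
    { "id": 2, "title": "world" }
  ]
}
```

json-server then automatically exposes REST routes over it, such as GET /posts and GET /posts/1.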

Nodemon keeps restarting server

I have an express server that uses a local json file for a database. I'm using https://github.com/typicode/lowdb for getters and setters.
Currently the server keeps starting and restarting without any error messages, but I can't access it. Below is my server.js file:
import express from 'express'
import session from 'express-session'
import bodyParser from 'body-parser'
import promisify from 'es6-promisify'
import cors from 'cors'
import low from 'lowdb'
import fileAsync from 'lowdb/lib/storages/file-async'
import defaultdb from './models/Pages'
import routes from './routes/index.js'
const app = express();
const db = low('./core/db/index.json', { storage: fileAsync })
app.use(cors())
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: true }));
app.use('/', routes);
app.set('port', process.env.PORT || 1337);
db.defaults(defaultdb).write().then(() => {
  const server = app.listen(app.get('port'), () => {
    console.log(`Express running → PORT ${server.address().port}`);
  });
});
Has anyone had an issue like this before? I think it has something to do with this line:
db.defaults(defaultdb).write().then(() => {
  const server = app.listen(app.get('port'), () => {
    console.log(`Express running → PORT ${server.address().port}`);
  });
});
From the documentation:
nodemon will watch the files in the directory in which nodemon was started, and if any files change, nodemon will automatically restart your node application.
If your db's .json file is under nodemon's watch and you're constantly writing to it, your server will restart in an infinite loop, making it inaccessible. Try moving the .json file outside the scope of nodemon's watch, either by moving it outside your project directory or via nodemon configuration (if possible).
I solved this issue from this page.
Practically, you just have to do:
nodemon --ignore 'logs/*'
Update: the link was hijacked and has been removed.
My solution: I've added nodemonConfig in package.json file in order to stop infinite loop/restarting. In package.json:
"nodemonConfig": { "ext": "js", "ignore": ["*.test.ts", "db/*"], "delay": "2" },
"scripts": { "start": "nodemon" }
I was puzzled by a constant stream of restarts. I started with nodemon --verbose to see what was causing the restarts.
This revealed that my package.json file was the culprit. I was running my installation in a Dropbox folder and had just removed all files from my node_modules folder and done a fresh install. Another computer that shared my Dropbox folder was running at the time and, unknown to me, it was busily updating its node_modules files and updating the Dropbox copies of the package.json files as it did so.
My solution turned out to be simple: I took a break and waited for Dropbox to finish indexing the node_modules folder. When Dropbox finished syncing, nodemon ran without any unexpected restarts.
In my case (which is the same as the OP) just ignoring the database file worked
nodemon --ignore server/db.json server/server.js
You can use this generalized config file.
Name it nodemon.json and put in the root folder of your project.
{
  "restartable": "rs",
  "ignore": [".git", "node_modules/", "dist/", "coverage/"],
  "watch": ["src/"],
  "execMap": {
    "ts": "node -r ts-node/register"
  },
  "env": {
    "NODE_ENV": "development"
  },
  "ext": "js,json,ts"
}
I solved this by adding the following code to the package.json file:
"nodemonConfig": {
  "ext": "js",
  "ignore": [
    "*.test.ts",
    "db/*"
  ],
  "delay": "2"
}
I solved this by creating a script in my package.json like this:
"scripts": {
  "start-continuous": "supervisor server/server.js"
},
This will work if you have supervisor installed in your global scope.
npm install supervisor -g
Now all I do is: npm run start-continuous

Set up proxy server for create react app

I have started a React application using create-react-app and ran the npm run eject script to gain access to all files. I then installed express and created a server.js file that sits on the same level as the package.json file.
These are the server.js file contents:
const express = require('express');
const app = express();
app.set('port', 3031);

if (process.env.NODE_ENV === 'production') {
  app.use(express.static('build'));
}

app.listen(app.get('port'), () => {
  console.log(`Server started at: http://localhost:${app.get('port')}/`);
});
Nothing crazy here, just setting up for future API proxies where I need to use secrets, as I don't want to expose my API.
After this I added "proxy": "http://localhost:3001/" to my package.json file. I am now stuck, as I need to figure out how to start my server correctly and use this server.js file in development mode and afterwards in production.
Ideally it would also be good if we could use more than one proxy, i.e. /api and /api2.
You didn't have to eject to run your server.js. You can just run it with node server.js alongside create-react-app.
You can still do npm start even after ejecting to start your dev server.
To handle /api1 and /api2, you just have to handle them in your server.js file and it should work just fine. You also need the port in your server.js to match the one in the proxy setting inside package.json - in this case it should be "proxy": "http://localhost:3031".
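The idea of dispatching /api1 and /api2 to different backends can be sketched as a simple prefix router (a simplified illustration using plain functions; the target URLs and names are made up, and in the real server.js these would be express route handlers or proxy middleware):

```javascript
// Map path prefixes to hypothetical backend targets (names are assumptions)
const targets = {
  '/api1': 'http://localhost:4001',
  '/api2': 'http://localhost:4002',
};

// Pick the backend whose prefix matches the request path;
// null means "no API prefix matched, serve the React build instead"
function pickTarget(path) {
  const prefix = Object.keys(targets).find(
    (p) => path === p || path.startsWith(p + '/')
  );
  return prefix ? targets[prefix] : null;
}

console.log(pickTarget('/api1/users'));  // 'http://localhost:4001'
console.log(pickTarget('/index.html')); // null -> static build
```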

Grunt dev server to allow push states

I am trying to set up my grunt server to allow push states.
After countless google searches and reading SO posts I cannot figure out how to do this.
I keep getting errors like the one below.
Does anyone have any ideas how to fix this?
No "connect" targets found. Warning: Task "connect" failed. Use --force to continue.
It appears to me that I have defined targets with the lines below:
open: {
target: 'http://localhost:8000'
}
See complete code below:
var pushState = require('grunt-connect-pushstate/lib/utils').pushState;

module.exports = function (grunt) {
  // Project configuration.
  grunt.initConfig({
    pkg: grunt.file.readJSON('package.json'),
    connect: {
      options: {
        hostname: 'localhost',
        port: 8000,
        keepalive: true,
        open: {
          target: 'http://localhost:8000'
        },
        middleware: function (connect, options) {
          return [
            // Rewrite requests to root so they may be handled by router
            pushState(),
            // Serve static files
            connect.static(options.base)
          ];
        }
      }
    }
  });

  grunt.loadNpmTasks('grunt-contrib-uglify'); // Load the plugin that provides the "uglify" task.
  grunt.loadNpmTasks('grunt-contrib-connect'); // Load the plugin that provides the "connect" task.

  // Default task(s).
  grunt.registerTask('default', ['connect']);
};
Push states are already included in most SPA frameworks, so you might not need this unless you're building a framework.
Angular: https://scotch.io/tutorials/pretty-urls-in-angularjs-removing-the-hashtag
React: How to remove the hash from the url in react-router
This looks like a grunt build script to compile an application to serve. So I'm not exactly sure how you'd use pushStates in the build process. You may be trying to solve the wrong problem.
Don't bother with grunt to deploy a local dev pushstate server for your SPA.
In your project directory, install https://www.npmjs.com/package/pushstate-server
npm i pushstate-server -D
Then to launch it, add a script entry in the package.json of your project:
…
"scripts": {
"dev": "pushstate-server"
}
…
This way you can now start it by running npm run dev.
All the requests which would normally end in a 404 will now redirect to index.html.
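That fallback behaviour can be sketched as a simple resolution rule (a simplified illustration, not pushstate-server's actual implementation; the file list and paths are made up):

```javascript
// Serve the file if it exists in the static build; otherwise fall back
// to index.html so the SPA router can handle the pushstate URL.
function resolveFile(urlPath, staticFiles) {
  return staticFiles.includes(urlPath) ? urlPath : '/index.html';
}

console.log(resolveFile('/app.js', ['/index.html', '/app.js']));   // real file: '/app.js'
console.log(resolveFile('/users/42', ['/index.html', '/app.js'])); // SPA route: '/index.html'
```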
