How can I split my Vue.js project into development and test environments?

I want to split my Vue.js frontend project into two environments, development and test. In development I want to work locally and send requests to the example:8010 URL, and in test I want to send requests to the example:80 address. How can I do this? I searched around but couldn't find anything.
Example of a request I wrote:
var formData = new FormData();
formData.append('file', this.image[0]);
await axios
  .post('example', formData, {
    headers: {
      Authorization: `Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJlbWFpbCI6fdsaeqImNoZ3VuYXkzQGdtYWlsLmNvbqSIsImlkIjoxLCJ0eXBlIjoxLCJpYXQiOjE2NDU0NDUxMjR9.Kg8NcFiAKtBHxkQsRwl2pO6svp7SDQSQw13SJ4xe1vc`,
    },
  })
  .then((response) => {
    console.log(response.data.id);
  })
  .catch((error) => {
    console.log(error);
  });

Serkan
You can use .env files for your environments. You can add the dotenv package to your project, or you can pass parameters in the scripts section of your package.json.
You can follow the Vue CLI documentation on modes and environment variables to use env variables inside a Vue app. Keep in mind your variable names must start with VUE_APP_:
VUE_APP_BASE_API = 'yourapplink/api'
You should create two .env files, one for local development and another for production, and you can add as many as you like.
For example, you can create .env.development, put your variables in it, and then run your Vue app with:
vue-cli-service build --mode development
and the app will use your .env.development file.
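A minimal sketch of how that could look for the two addresses in the question (the variable name and the helper module are my own examples, not a fixed convention): give VUE_APP_BASE_API a different value per mode and read it once where the axios instance is created.
# .env.development
VUE_APP_BASE_API=http://example:8010
# .env.test
VUE_APP_BASE_API=http://example:80
// src/http.js (hypothetical helper module)
import axios from 'axios';
// Vue CLI inlines VUE_APP_* variables at build/serve time,
// so each mode gets its own base URL baked in.
export default axios.create({
  baseURL: process.env.VUE_APP_BASE_API,
});
The request from the question can then go through this instance with a path relative to the base URL, and switching between development and test is just a matter of the --mode flag.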

Related

Autodesk Forge web application - from Visual Studio Code to a closed .exe file

I have a working Forge application (a BIM 360 hub sidebar with the Forge viewer and some charts).
It currently runs only from the Visual Studio Code IDE. I want to build the app into an .exe file so I can send it to a user, upload it to a server with IIS, etc.
General details:
I used Petr Broz's tutorial to set up the backend of the viewer and hub
(Forge online training - view your models https://www.youtube.com/watch?v=-O1e3gXCOEQ&t=8986s).
The app runs on Node.js.
I tried to use the 'nexe' module to build an executable file. With this method, I need to specify an index.js file ("an entry point") and define a 'nexe.config.js' file. I used start.js as the entry point.
Eventually I managed to create an exe file, but when I run it from the command line I get the error
Missing FORGE_CLIENT_ID or FORGE_CLIENT_SECRET env. variables.
although I have them in config.js.
Main questions:
Is there another way to build a closed exe file from Visual Studio Code for a Forge web application?
Am I doing something wrong in the process described above?
Is it even possible to deploy a web application to IIS using an exe file? All of the documentation points toward Azure, AWS and Heroku.
Relevant files:
1) start.js:
const path = require('path'); // built-in Node.js module (to resolve file system paths)
const express = require('express'); // module to create the express server
const cookieSession = require('cookie-session');
// every middleware registered below gets an opportunity to handle the request
const PORT = process.env.PORT || 3000;
const config = require('./config.js');
if (config.credentials.client_id == null || config.credentials.client_secret == null) {
    console.error('Missing FORGE_CLIENT_ID or FORGE_CLIENT_SECRET env. variables.');
    return;
}
let app = express();
// static middleware for the front-end files (html, js, css): checks whether the requested
// file exists in the 'public' folder; if so, the rest of the stack (the rest of the code) is skipped
app.use(express.static(path.join(__dirname, 'public')));
app.use(cookieSession({
    // creates a cookie that stores the name and the encrypted key
    name: 'forge_session',
    keys: ['forge_secure_key'], // takes care of deciphering the encrypted Forge key for us
    maxAge: 14 * 24 * 60 * 60 * 1000 // 14 days, same as refresh token
}));
// middleware that looks at the request headers - if the content type is JSON,
// it parses the request body into a JavaScript object
app.use(express.json({ limit: '50mb' }));
// our custom express routers that handle the different endpoints
app.use('/api/forge', require('./routes/oauth.js'));
app.use('/api/forge', require('./routes/datamanagement.js'));
app.use('/api/forge', require('./routes/user.js'));
app.use((err, req, res, next) => {
    console.error(err);
    res.status(err.statusCode).json(err);
});
app.listen(PORT, () => { console.log(`Server listening on port ${PORT}`); });
2) config.js:
// Autodesk Forge configuration
module.exports = {
    // Set environment variables or hard-code here
    credentials: {
        client_id: process.env.FORGE_CLIENT_ID,
        client_secret: process.env.FORGE_CLIENT_SECRET,
        callback_url: process.env.FORGE_CALLBACK_URL
    },
    scopes: {
        // Required scopes for the server-side application --> privileges for our internal operations on the server ("back end")
        internal: ['bucket:create', 'bucket:read', 'data:read', 'data:create', 'data:write'],
        // Required scope for the client-side viewer --> privileges for the client ("front end")
        public: ['viewables:read']
    }
};
Author of the tutorial here :)
I'm not sure how nexe works exactly but please note that the sample app expects input parameters such as FORGE_CLIENT_ID or FORGE_CLIENT_SECRET to be provided as environment variables.
As a first step, try running your *.exe file after setting the env. variables in your command prompt.
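For example, in a Windows command prompt that could look like this (the values and the AppName.exe name are placeholders):
set FORGE_CLIENT_ID=your-client-id
set FORGE_CLIENT_SECRET=your-client-secret
set FORGE_CALLBACK_URL=your-callback-url
AppName.exe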
If that doesn't work, try hard-coding the input parameters directly into the config.js file (replacing any of the process.env.* references), and then bundle everything into an *.exe file. This is just for debugging purposes, though! You shouldn't share your credentials with anyone, not even inside an *.exe file. So as an alternative I'd suggest that you update the sample app to read the input parameters from somewhere else, perhaps from a local file.
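A rough sketch of that last idea (the credentials.json file name and shape are my assumption, not part of the tutorial): let config.js fall back to a local JSON file whenever the environment variables are missing, so the values travel with the packaged app without being hard-coded in source.
// config.js (sketch) -- fall back to a local credentials.json when env vars are missing
const fs = require('fs');
const path = require('path');
let fileCredentials = {};
const credentialsPath = path.join(__dirname, 'credentials.json'); // hypothetical file
if (fs.existsSync(credentialsPath)) {
    fileCredentials = JSON.parse(fs.readFileSync(credentialsPath, 'utf8'));
}
module.exports = {
    credentials: {
        client_id: process.env.FORGE_CLIENT_ID || fileCredentials.client_id,
        client_secret: process.env.FORGE_CLIENT_SECRET || fileCredentials.client_secret,
        callback_url: process.env.FORGE_CALLBACK_URL || fileCredentials.callback_url
    },
    scopes: {
        internal: ['bucket:create', 'bucket:read', 'data:read', 'data:create', 'data:write'],
        public: ['viewables:read']
    }
};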
After trying a lot of solutions, I came to the conclusion that the reason nothing happened was that the authentication files (with the client_id and client_secret) were not embedded in the .exe file.
The way to include those files with the nexe module is to use the flag -r "Foldername/subfoldername/filename.js".
First, create a nexe.config.js file that contains the entry point file name of the app (in my case, the file name is "start.js").
Second, run the following commands in the command line:
cd C:\Projects\MyAppFolder
npm install -g nexe
// specify all the files you want to include inside the exe file
nexe start.js -r "config.js" -r "nexe.config.js" -r "routes/common/oauth.js" -r "routes/*.js" -r "public/**/*" -r ".vscode/**/*" -r "package-lock.json" -r "package.json" --build --output "AppName.exe"

Docker compose network not resolving hostname in javascript http request

I am currently writing a small full-stack application using Docker Compose, Vue, and Python. All my containers work in isolation, but I can't seem to get them to communicate using host names. Here's my code:
Docker Compose
version: "3.8"
services:
web:
build: ./TranscriptionFrontend
ports:
- 4998:4998
api:
build: ./TranscriptionAPI
Javascript Frontend Request
fetch("http://api:4999/transcribe", {
method:'POST',
headers: {'Content-Type': 'application/json'},
body: JSON.stringify(data)
}).then(res => {
res.json().then((json_obj) =>{
this.transcription_result = json_obj['whisper-response']
})
}).catch(e => {
this.transcription_result = "Error communicating with api: " + e;
})
I know my API service works because originally I was just mapping it to a port on localhost, but that got messy and I want to keep access to it within my Docker setup. In all cases the host name could not be resolved from my JS request. Also, curl-ing from my containers using host names does give a response, i.e. docker-compose exec web curl api works, and vice versa. I'm a beginner at JavaScript and Docker, so apologies if I'm missing something basic.
What I've tried:
XMLHttpRequest
Making the call without http://
Your Docker service name won't resolve in the URL unless you configure a virtual host on your machine that points to it, because the request is made by the browser, not from inside the Docker network.
You can look into something like How to change the URL from "localhost" to something else, on a local system using wampserver? if that is what you really want to do, but I feel it might be defeating the point.
My approach would be to pass the API URL as an environment variable to your front-end service directly in the Docker Compose file. Something like:
services:
  web:
    build: ./TranscriptionFrontend
    ports:
      - 4998:4998
    environment:
      API_URL: http://127.0.0.1:4999
  api:
    build: ./TranscriptionAPI
    ports:
      - 4999:4999
and then inject the env variable into your JavaScript app build.
For example, assuming you're using a Node.js-based stack for compiling your JS, you could have it available as a process.env property and then do:
fetch(`${process.env.API_URL}/transcribe`, { ... })
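As a sketch of that build-time injection (assuming a plain webpack build; Vue CLI and create-react-app have their own env-variable conventions), webpack's DefinePlugin can inline the value into the bundle:
// webpack.config.js (sketch)
const webpack = require('webpack');
module.exports = {
  // ...the rest of your webpack config
  plugins: [
    new webpack.DefinePlugin({
      // replaces every literal occurrence of process.env.API_URL in the source
      // with the value taken from the environment at build time
      'process.env.API_URL': JSON.stringify(process.env.API_URL),
    }),
  ],
};
DefinePlugin performs a plain text substitution, so the code must reference process.env.API_URL literally for the replacement to happen.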
Host names for Docker container-to-container communication work when the containers share a network, such as a user-defined bridge network. Here's an example:
(On your docker-compose.yml)
...
networks:
  my_network:
    driver: bridge
Here's a link to the Docker docs just in case. Let me know if this helped!
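For completeness, a minimal sketch of the services joining that network (service names taken from the question; the network name is arbitrary):
services:
  web:
    build: ./TranscriptionFrontend
    ports:
      - 4998:4998
    networks:
      - my_network
  api:
    build: ./TranscriptionAPI
    networks:
      - my_network
networks:
  my_network:
    driver: bridge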

React Axios post request is getting timed out after 4 minutes

Building a React app. Using Axios to upload a file. A simple POST request with FormData and a multipart/form-data header is timing out after 4 minutes in Chrome and 2 minutes in Safari.
As the backend I'm using Django.
In the dev environment I'm using the proxy parameter in package.json to connect to the backend via localhost.
For production I build static files via the npm run build command, and Django serves them directly without any proxy like nginx.
The issue is present in both the local/dev and production environments.
I need help as I'm going insane here; no workaround I've tried is working.
Thanks in advance.
loadData = () => {
  let formData = new FormData();
  formData.append('file', this.state.file);
  let path = `/path`;
  this.setState({ loading: true });
  axios(path, {
    method: 'POST',
    headers: {
      'Content-Type': 'multipart/form-data'
    },
    data: formData
  });
}
Ok!
It's been quite a while since this post, but I figured it out. Apparently it was the timeout configured on Gunicorn; it was exactly 120 seconds. Why was it longer in Chrome? No idea, but that difference between Chrome and the rest of the browsers threw me in the wrong direction.
So yeah, fixed.
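For reference, the Gunicorn worker timeout is set with the --timeout flag; a sketch of raising it (the 300-second value and the myproject.wsgi module name are placeholders for your own setup):
gunicorn --timeout 300 myproject.wsgi:application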

create-react-app exclude/include/change code parts at build

I'm developing a React web app using the create-react-app npm utility.
My app communicates with a server which, during development, runs on my local machine. For this reason all the Ajax requests I make use a localhost:port address.
Of course, when I build and deploy the project to production I need those addresses to change to the production ones.
I am used to the preprocess Grunt plugin workflow (https://github.com/jsoverson/grunt-preprocess), which can mark parts of code to be excluded, included or changed at build time.
For example:
//#if DEV
const SERVER_PATH = "localhost:8888";
//#endif
//#if !DEV
const SERVER_PATH = "prot://example.com:8888";
//#endif
Do you know if there is a way to do such a thing within the create-react-app development environment?
Thank you in advance!
I'm not too sure exactly how your server-side code handles requests; however, you shouldn't have to change your code when deploying to production if you use relative paths in your ajax queries. For example, here's an ajax query that uses a relative path:
$.ajax({
  url: "something/getthing/",
  dataType: 'json',
  success: function (data) {
    // do a thing
  }
});
Hopefully that helps :)
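The same idea with fetch, which may fit a create-react-app project better than jQuery (a sketch; the relative path comes from the example above and resolves against whatever origin serves the React build):
fetch('something/getthing/')
  .then((res) => res.json())
  .then((data) => {
    // do a thing
  });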
When creating your networkInterface, use process.env.NODE_ENV to determine which PATH to use.
// create-react-app sets NODE_ENV to 'development' in the dev server and to 'production' for builds
const SERVER_PATH = process.env.NODE_ENV !== 'production'
  ? "localhost:8888"
  : "prot://example.com:8888";
Your application will automatically detect whether it is running in production or development and create the const SERVER_PATH with the correct value for that environment.
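A closely related option (my addition, not part of this answer) is create-react-app's own .env support: a variable prefixed with REACT_APP_, for example REACT_APP_SERVER_PATH, can be given one value in .env.development and another in .env.production, and npm start / npm run build pick up the matching file automatically. In code it is read the same way:
const SERVER_PATH = process.env.REACT_APP_SERVER_PATH;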
According to the docs, the dev server can proxy your requests. You can configure it in your package.json like this:
"proxy": "http://localhost:4000",
Another option is to ask the browser for the current location. It works well when your API and static files are on the same backend, which is common with Node.js and React.
Here you go:
const { protocol, host } = window.location
const endpoint = `${protocol}//${host}`
// fetch(endpoint)

Loading weather API data into electron App

I started a project on my Raspberry Pi running an Electron app where I need to get the current weather from an open weather API. I am totally new to Electron and not that experienced in JavaScript, so I am stuck on getting the data from the weather API into the app. I can request the data as JSON or XML. I have tried different approaches that I thought might work, but they all failed. So could someone tell me how to get API data into Electron in general?
The easiest way to start with API requests is to use axios.
After setting up the project (you can follow Getting Started), follow these steps:
Install Axios npm install --save axios
Create main.js in your project's folder.
Load main.js inside index.html somewhere before </body>.
Put the following JavaScript code inside main.js:
const axios = require('axios');
function fetchData() {
  // you might need the next line, depending on your API provider.
  axios.defaults.headers.post['Content-Type'] = 'application/json';
  axios.post('api.example.com', { /* here you can pass any parameters you want */ })
    .then((response) => {
      // Here you can handle the API response
      // Maybe you want to add it to your HTML via JavaScript?
      console.log(response);
    })
    .catch((error) => {
      console.error(error);
    });
}
// call the function so it starts executing when the page loads inside Electron.
fetchData();
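As a more concrete sketch for a weather API (the URL, query parameters, API key and element id below are assumptions; OpenWeatherMap, for example, is queried with a GET request), the response can be written straight into the page:
// sketch for main.js: fetch the current weather and show it in the HTML
// (assumes index.html contains something like <p id="weather"></p>)
const axios = require('axios');
function showWeather() {
  axios.get('https://api.openweathermap.org/data/2.5/weather', {
    params: { q: 'Berlin', appid: 'YOUR_API_KEY', units: 'metric' } // placeholder values
  })
    .then((response) => {
      const el = document.getElementById('weather');
      el.textContent = `${response.data.name}: ${response.data.main.temp} °C`;
    })
    .catch((error) => {
      console.error(error);
    });
}
showWeather();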
