I am using Quasar v2 for my Electron app development.
In this version, native Node modules no longer work in the renderer process.
To get around this, there is electron-preload.js, where we can access these Node methods and expose them to the rest of the app.
So I did this to expose fs:
import { contextBridge } from 'electron'
contextBridge.exposeInMainWorld('electronFs', require('fs'))
Then on my Vue/js files:
const fs = window.electronFs || require('fs')
fs.lstatSync(somePath).isFile()
But then I am getting fs.lstatSync(...).isFile is not a function. Any thoughts on how I can make this work?
This is just for a local application (no web connectivity).
Thanks!
By default, the Main process doesn't share the same context as the Renderer's, and the Preload script runs in the Renderer's context but isolated from it. Instead of exposing native Node.js APIs to the Renderer insecurely (even though you're building a local app), you can use IPC to expose limited interfaces to the Renderer's Preload script, then use contextBridge to expose those APIs to the Renderer. Here are some demo snippets.
/* Main.js */
const fs = require('fs');
const {ipcMain} = require('electron');
// The first handler argument is the IPC event; the payload comes second
ipcMain.handle('isFile', (event, path) =>
  fs.lstatSync(path).isFile()
);
/* Preload.js */
import { ipcRenderer, contextBridge } from "electron";
contextBridge.exposeInMainWorld('myAPIs', {
  // Forward the path argument and return the Promise from invoke()
  isFile: (path) => ipcRenderer.invoke('isFile', path)
});
/* index.vue */
// invoke() is async, so the result comes back as a Promise
const isFile = await window.myAPIs.isFile('path-to-file');
That happens because, in Electron, the backend and the UI don't share the same process. If you allowed native Node APIs to be exposed into the user context (the UI that ultimately runs in Chromium), then any externally loaded JavaScript could access all those native APIs on the user's computer.
So those variables get transported from one process to the other using IPC (Inter-Process Communication). IPC carries only data, and that data must be serializable: primitives such as strings, numbers, and dates, or plain objects serialized to a string or binary form. To send objects from the parent context to the UI context, Electron removes all non-serializable properties, so no functions survive; that is why isFile is missing on the other side.
Don't do that; it's extremely insecure. Just move data from one side to the other, which is very similar to using WebSockets.
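To make the failure concrete, here is a small illustration (a sketch; the point is to ship the computed result across IPC, not the object that carries the method):
/* What goes wrong when exposing fs directly */
const stats = require('fs').lstatSync('/some/path');
// stats.isFile is a function, and functions are stripped when the
// object crosses the context bridge, hence "isFile is not a function".
// Send the computed result instead: a boolean serializes fine.
const isFile = stats.isFile();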
I have tried importing my Kafka library inside page.svelte, and it has thrown an error saying it is not allowed to import Kafka inside the script.
<script>
import Table from "$lib/Table.svelte";
import {Kafka, logLevel} from 'kafkajs';
</script>
Later I found we can export the Kafka producer by setting up the Kafka import in hooks.server.js. I imported the Kafka library in hooks.server.js and added the producer to event.locals, then exported it to the script tag of my page.svelte:
export let producer;
Still, the value of producer is undefined. Any leads on how to expose the imports from hooks.server.js to page.svelte would be much appreciated.
You can't do that at all: the library is not meant for the browser, and data transferred to the page has to be serialized, which is impossible here.
From the FAQ:
Can KafkaJS be used in a browser?
No - KafkaJS is a library for NodeJS and will not work in a browser context. Communication with Kafka happens over a TCP socket. The NodeJS API used to create that socket is net.connect/tls.connect. There is no equivalent API in browsers as of yet, which means that even if you run KafkaJS through a transpilation tool like Browserify, it cannot polyfill those modules.
You could expose the functionality via custom API endpoints instead of trying to use it directly.
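For example, a SvelteKit server route could wrap the producer, and the page would call it with fetch instead of importing kafkajs. A minimal sketch (the route path, broker address, and payload shape are assumptions):
/* src/routes/api/produce/+server.js - runs only on the server */
import { json } from '@sveltejs/kit';
import { Kafka } from 'kafkajs';

const kafka = new Kafka({ clientId: 'my-app', brokers: ['localhost:9092'] });
const producer = kafka.producer();
await producer.connect(); // top-level await works in SvelteKit server modules

export async function POST({ request }) {
  const { topic, message } = await request.json();
  await producer.send({ topic, messages: [{ value: message }] });
  return json({ ok: true });
}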
How one would transfer locals otherwise:
Locals are not exposed automatically; they are only available in server functions. One can pass locals from the server to the page, e.g. in the top-level +layout.server.js load function:
export const load = ({ locals }) => {
return { value: locals.value };
};
Data returned from this function is available to all pages using the layout via the data property.
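A page using that layout can then read the value from its data prop (a minimal sketch):
<!-- +page.svelte -->
<script>
  export let data; // populated by the layout load function
</script>

<p>{data.value}</p>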
I'm trying to develop a desktop app which would need to make a few private API calls, authenticated using some secret keys.
The keys are created for me by external IT service providers outside of my organisation. They are responsible for security, so there are a few constraints:
They said that even though they have already taken steps on their end to secure the API, and mitigation strategies are in place even if a breach happens, they would still like me to treat the keys with a security-conscious mindset and take whatever steps possible on my end to make sure they remain secured.
I'm not allowed to just create random middleware / gateway on a private server or serverless platform to perform the API calls on my app's behalf as these calls may contain business data.
I have done some research and from what I can find, the general recommendation is to set up a ".env" file in the project folder and use environment variables in that file to store the API keys.
But upon reading the Vue CLI documentation I found the following:
WARNING
Do not store any secrets (such as private API keys) in your app!
Environment variables are embedded into the build, meaning anyone can view them by inspecting your app's files.
So, given the constraints, is there a way to store these keys securely in a Vue CLI 4 + Electron Desktop app project?
Thanks.
In general, especially if you have a lot of environment variables, it is better practice to store them in a dotenv file (.env). However, that file could be leaked when you package your Electron app, so in this case it is better to set your environment variables from the terminal/command line. To do so, follow this guide: https://www.electronjs.org/docs/api/environment-variables
Keep in mind: anything that requires the API key/private information should stay on the backend, i.e. the Electron main process, which then sends the results to the Vue front end.
Here's an example of how you could implement this:
On windows from CMD:
set SOME_SECRET="a cool secret"
On POSIX:
$ export SOME_SECRET="a cool secret"
Main process:
// Other electron logic
const { ipcMain } = require("electron");
// Listen for an event sent from the client to do something with the secret
ipcMain.on("doSomethingOnTheBackend", (event, data) => {
  // "API" is a placeholder for your HTTP client of choice (e.g. axios)
  API.post("https://example.com/some/api/endpoint", { token: process.env.SOME_SECRET, data });
});
Client side:
const { ipcRenderer } = require("electron");
ipcRenderer.send("doSomethingOnTheBackend", {username: "test", password: "some password"});
Also note: to use ipcRenderer directly on the client side, nodeIntegration needs to be enabled.
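If you would rather keep nodeIntegration disabled, the same call can be routed through a preload script instead (a sketch reusing the contextBridge pattern shown in the first answer above):
/* preload.js */
const { contextBridge, ipcRenderer } = require("electron");
contextBridge.exposeInMainWorld("backend", {
  doSomethingOnTheBackend: (data) => ipcRenderer.send("doSomethingOnTheBackend", data)
});

/* Client side then becomes: */
window.backend.doSomethingOnTheBackend({ username: "test", password: "some password" });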
Here are some more resources to help you get started:
https://www.electronjs.org/docs/api/ipc-renderer
https://www.electronjs.org/docs/api/ipc-main
I have a file called utils.js which holds some constants, and I'm building a socket app that uses Node for the backend and regular JavaScript for the frontend. The names of the socket events are identical on both sides, so I want to use the constants in both the backend and the frontend. But the export/import syntax differs between ECMAScript 6 and Node, so how do I pull this off?
This is the contents of utils.js:
const EVENTS = {
CONNECTION: 'connection',
MESSAGE: 'message'
};
Object.freeze(EVENTS);
module.exports = EVENTS;
You can put the objects in a JSON file, like events.json, or even an events.js if you want plain JS objects. Both Node and the browser have access to this: you require/import the file in utils.js on the back end and do the equivalent on the front end. Each side can then handle the constants however it wants.
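For instance, a shared events.js could look like this (a minimal sketch, assuming the browser loads it via a plain script tag so EVENTS becomes a global there):
/* events.js - shared between Node and the browser */
const EVENTS = Object.freeze({
  CONNECTION: 'connection',
  MESSAGE: 'message'
});

// Export for Node (CommonJS) without breaking the browser
if (typeof module !== 'undefined' && module.exports) {
  module.exports = EVENTS;
}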
I'm trying to use serialport.js; even just including it at the moment causes a runtime error: TypeError: undefined is not an object (evaluating 'stream._ansicursor').
var React = require('react')
var SerialPort = require('serialport')
The context of this is within a React app, so it's being compiled from JSX. I have no idea what this error is or how it came about; looking at the line given in the error, it comes from this code in serialport.js:
/**
* Creates a Cursor instance based off the given `writable stream` instance.
*/
function ansi (stream, options) {
if (stream._ansicursor) {
return stream._ansicursor
} else {
return stream._ansicursor = new Cursor(stream, options)
}
}
module.exports = exports = ansi
The serialport package you are trying to use operates on Node.js streams, which means it needs to run on the server within the Node.js context.
Depending on what you're trying to do with the serial port package, if you want to continue to have a React based web UI for this task, you will need to separate out the serialport actions and write an API using a nodejs framework. I'd suggest something like ExpressJS.
This way you can send requests to API urls that will perform the serialport tasks on the server and return JSON feedback that your web application, written in React, can interact with.
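A minimal sketch of that setup (assuming the serialport v10+ API; the endpoint path and port are made up):
/* server.js - wraps serialport behind a small Express API */
const express = require('express');
const { SerialPort } = require('serialport');

const app = express();

// The React app fetches this URL and gets the device list as JSON
app.get('/api/ports', async (req, res) => {
  const ports = await SerialPort.list();
  res.json(ports);
});

app.listen(3000, () => console.log('API listening on http://localhost:3000'));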
I'm looking for either a reference or an answer to what I think is a very common problem that people who are current implementing JavaScript MVC frameworks (such as Angular, Ember or Backbone) would come across.
I am looking for a way, or a common pattern, to externalize application properties so they are accessible in the JS realm: something that would allow the JavaScript to load server-side properties such as endpoints, salts, etc. that live outside the application root. The issue I keep running into is that browsers typically have no access to the file system, for security reasons.
Therefore, what is the recommended approach for loading properties that are configurable outside of a deployable artifact if such a thing exists?
If not, what is currently being used in practice that is considered the recommended approach for this type of problem?
I am looking for a cross compatible answer (Google Chrome is awesome, I agree).
Data Driven Local Storage Pattern
Just came up with that!!
The idea is to load the configuration properties through a naming-by-convention scheme in which everything is derived from the target hostname. That is, the hostname determines a trusted endpoint, and that endpoint serves the corresponding properties to the application. These application properties contain information that is resolved at runtime; that runtime information is supplied to the integration parts, which consume it by iterating over the stored properties during bootstrap.
To keep it simple, we'll just use two properties here:
This implementation is Ember JS specific but the general idea should be portable
I am narrowing the scope of this question to a specific technological perspective, namely Ember JS, with the following remedy that is working properly for me. I hope it helps anyone out there dealing with the same issue.
Ember.Application.initializer implementation at startup
initialize: function (container, application) {
  var origin = window.location.origin;
  var host = window.location.hostname;
  var port = window.location.port;
  var configurationEndPoint = '';
  // local mode
  if (host === 'localhost') {
    // standalone, using the API stub on Node.js
    if (port === '8000') {
      configurationEndPoint = '/api/local';
    } else {
      // standalone UI app integrating with a back-end application
      // on the same machine, different port
      configurationEndPoint = '/services/env';
    }
    origin += configurationEndPoint;
  } else {
    throw Error('Unsupported Environment!!');
  }
  // load the configuration from a trusted resource on start up
  $.get(origin, function (data) {
    // store all configuration key/value pairs in localStorage for access
    var configuration = data.configuration;
    for (var config in configuration) {
      localStorage.setItem(config, configuration[config]);
    }
  });
}
Configurable Adapter
import DS from 'ember-data';

export default DS.RESTAdapter.extend({
  // values read back from the localStorage entries set by the initializer above
  host: localStorage.host,
  namespace: localStorage.namespace
});
Just yesterday morning I was tackling the same issue.
Basically, you have two options:
Use localStorage/indexedDB or any other client-side persistent storage. (But you have to get the config in there somehow.)
Render your main template (the one that always gets rendered) with a hidden element where you put the config JSON (sketched below).
Then in your app init code you get this config and use it. Plain and simple in theory, but let's get down to nasty practice (for the second option).
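Here is a minimal sketch of the second option (the element id and config keys are made up for illustration):
<!-- in the server-rendered main template -->
<script id="app-config" type="application/json">
  {"apiEndpoint": "/api", "salt": "abc123"}
</script>
<script>
  // App init code: parse the embedded JSON before bootstrapping
  var config = JSON.parse(document.getElementById('app-config').textContent);
</script>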
First, the client should get the config before the application loads, which is not always easy: e.g. the user may need to be logged in to see the config. In my case I check whether I can provide the config on the first request, and if not I redirect the user to the login page. This leads us to the second limitation: once you are ready to provide the config, you have to reboot the app completely so that the configuration code runs again (at least in Angular this is necessary, as you cannot access providers after the app bootstraps).
Another constraint: the second option is useless if you serve static HTML and cannot change it on the server before sending it to the client.
Maybe a better option would be to combine both variants. This should solve some problems for returning users, but the first interaction will not be very pleasant anyway. I have not tried this yet.