Load an external javascript file as a node process [duplicate] - javascript

This question already has answers here:
Load and execute external js file in node.js with access to local variables?
(6 answers)
Closed 4 years ago.
I'm writing integration tests for PureScript FFI bindings to Google's Maps API.
The problem: Google's code is meant to be loaded externally with a <script> tag in a browser, not downloaded and run in a Node process. What I've got now downloads the relevant file as gmaps.js, but I don't know what to do to actually run the file.
exports.setupApiMap = function() {
require('dotenv').config();
const apiKey = process.env.MAPS_API_KEY;
const gmaps = "https://maps.googleapis.com/maps/api/js?key=" + apiKey;
require('download')(gmaps, "gmaps.js");
// what now???
return;
};
For my unit tests, I must later be able to run new google.maps.Marker(...). Then I can check that my setTitle, getTitle, etc. bindings are working correctly.

This is a duplicate of this question. The correct code was:
exports.setupApiMap = async function() {
require('dotenv').config();
const apiKey = process.env.MAPS_API_KEY;
const gmaps = "https://maps.googleapis.com/maps/api/js?key=" + apiKey;
await require('download')(gmaps, __dirname);
const google = require('./js'); // the downloaded file is named 'js', after the last segment of the URL path
return;
};
The key was to download into __dirname before using require. That said, my specific use case still didn't work, since Google's Maps API code simply can't be run in a Node process. It must be run in a browser.

Does 'onDidChangeTextDocument()' promise in VScode extension depend on the user's active window to start listening?

I'm a new developer and this is my first Stack Overflow post. I've tried to stick to the format as best as possible. It's a difficult issue for me to explain, so please let me know if there are any problems with this post!
Problem
I'm working on a VS Code extension built specifically for Next.js applications and running into issues with an event listener for the onDidChangeTextDocument() method. I'm looking to capture data from a JSON file that will always be located in the root of the project (it is automatically generated/updated on each refresh of the test Node server for the Next.js app).
Expected Results
The extension is able to look for updates on the file using onDidChangeTextDocument(). However, the issue I'm facing is on the initial run of the application: in order for the extension to start listening for changes to the JSON file, the user has to open the JSON file. It's supposed to work no matter what file the user has open in VS Code. Once the user visits the JSON file while the extension is on, it begins to work from every file in the Next.js project folder.
Reproducing this issue is difficult because it requires an extension, an npm package, and a Next.js demo app, but the general steps are below. If needed, I can provide code for the rest.
1. Start debug session
2. Open Next.js application
3. Run application in node dev
4. Do not open the root JSON file
What I've Tried
Console logs show we are not entering the onDidChangeTextDocument() block until the user opens the root JSON file.
The file path to the root folder is correctly generated at all times, prior to the listener being reached.
Is this potentially an async issue? Or is the method somehow dependent on the user's active window to start looking for changes to that document?
Since the file is both created and updated automatically, we've tested for both, and neither works until the user opens the root JSON file in VS Code.
Relevant code snippet (this will not work alone, but I can provide the rest of the code if necessary):
export async function activate(context: vscode.ExtensionContext) {
console.log('Congratulations, your extension "Next Step" is now active!');
setupExtension();
const output = vscode.window.createOutputChannel('METRICS');
// this is getting the application's root folder filepath string from its uri
if (!vscode.workspace.workspaceFolders) {
return;
}
const rootFolderPath = vscode.workspace.workspaceFolders[0].uri.path;
// this gives us the fileName - we join the root folder URI with the file we are looking for, which is metrics.json
const fileName = path.join(rootFolderPath, '/metrics.json');
const generateMetrics = vscode.commands.registerCommand(
'extension.generateMetrics',
async () => {
console.log('Successfully entered registerCommand');
toggle = true;
vscode.workspace.onDidChangeTextDocument(async (e) => {
if (toggle) {
console.log('Successfully entered onDidChangeTextDocument');
if (e.document.uri.path === fileName) {
// name the command to be called on any file in the application
// this parses our fileName to an URI - we need to do this for when we run openTextDocument below
const fileUri = vscode.Uri.parse(fileName);
// open the file at the Uri path and get the text
const metricData = await vscode.workspace
.openTextDocument(fileUri)
.then((document) => {
return document.getText();
});
}
}
});
});
}
Solved this by adding an openTextDocument call inside the registerCommand block, outside of the onDidChangeTextDocument handler. This made the extension aware of the metrics.json file without it being open in the user's editor.
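A minimal sketch of that fix, with hypothetical helper names; the vscode API object is injected as a parameter here so the pattern can be exercised outside the editor:

```javascript
// Hypothetical sketch of the fix: call openTextDocument once, up front,
// so the editor loads metrics.json into its document model. After that,
// onDidChangeTextDocument fires for the file regardless of which file
// the user currently has open.
function watchMetricsFile(vscodeApi, fileUri, onMetrics) {
  // Pre-open the document (this does NOT show it in the UI).
  vscodeApi.workspace.openTextDocument(fileUri);

  // The change listener now sees edits to metrics.json from any active file.
  return vscodeApi.workspace.onDidChangeTextDocument(async (e) => {
    if (e.document.uri.path === fileUri.path) {
      const doc = await vscodeApi.workspace.openTextDocument(fileUri);
      onMetrics(doc.getText());
    }
  });
}
```

In a real extension the returned disposable would be pushed onto `context.subscriptions`.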

Open a directory of images into separate layers using Adobe extension

I am developing an Adobe extension; from within the extension I want to load a directory of images into separate layers within a document. I'm not partial to any particular approach, so if there is a better one, please share it with me. My current working method uses the open() method, which opens a file in a new document, then duplicates the new document's layer into the original document. An example of this can be seen below.
// open new document
var originalDoc = app.activeDocument;
var doc = open( new File( filePath ) );
// duplicate to original document
var layer = doc.activeLayer;
var newLayer = layer.duplicate(originalDoc, ElementPlacement.PLACEATBEGINNING);
// close new document
doc.close(SaveOptions.DONOTSAVECHANGES);
This method is extraordinarily slow, especially for large images. After doing some Googling I discovered that Photoshop has a built-in method for creating an image stack. This feature uses a .jsx script itself, and it can be found on GitHub. Looking around online I found a few people trying to load a folder's contents as layers, perfect. The main code I was interested in is below.
var folder = new Folder('~/Desktop/MyFolder');
function runLoadStack(folder) {
    var loadLayersFromScript = true;
    // #include 'Load Files into Stack.jsx'
    var fList = folder.getFiles('*.png');
    var aFlag = true;
    loadLayers.intoStack(fList, aFlag);
}
runLoadStack(folder);
I immediately noticed the #include method of importing the stack methods; I cannot find any official documentation for it (it's also not friendly with minification). Also, if the script is not placed in the same directory as Load Files into Stack.jsx, it throws the error Unable to open file: anonymous. And even after solving all of these issues, when I run the .jsx script from within my extension using $.evalFile(), I get the same error as if the script were not in the correct directory: Unable to open file: anonymous. The error is thrown on line 762 of an imported .jsx.
Any help resolving the error I am experiencing or simply on how to load an array of image paths into layers (faster method) will be greatly appreciated!
Here is the code I am using within my extension:
var loadLayersFromScript = true;
var strPresets = localize("$$$/ApplicationPresetsFolder/Presets=Presets");
var strScripts = localize("$$$/PSBI/Automate/ImageProcessor/Photoshop/Scripts=Scripts");
var jsxFilePath = app.path + "/" + strPresets + "/" + strScripts + "/Load Files into Stack.jsx";
$.evalFile( new File( jsxFilePath ) );
loadLayers.intoStack( new Folder("/c/Users/Me/teststack").getFiles(), true );
Photoshop's built-in scripts include one that does exactly this; here's the GitHub link:
https://github.com/ES-Collection/Photoshop-Scripts/blob/master/Import%20Folder%20As%20Layers.jsx
Use this script inside your CEP extension.

Get the rendered HTML from a fetch in javascript [duplicate]

This question already has answers here:
How can I dump the entire Web DOM in its current state in Chrome?
(4 answers)
Closed 3 years ago.
I’m trying to fetch a table from a site that needs to be rendered. That causes my fetched data to be incomplete. The body is empty, as the scripts haven't been run yet, I guess.
Initially I wanted to fetch everything in the browser, but I can’t do that since the CORS header isn't set and I don’t have access to the server.
Then I tried a server approach using node.js together with node-fetch and JSDOM. I read the documentation and found the option {pretendToBeVisual: true}, but that didn't change anything. I have simple code posted below:
const fetch = require('node-fetch');
const jsdom = require("jsdom");
const { JSDOM } = jsdom;
let tableHTML = fetch('https://www.travsport.se/uppfodare/visa/200336/starter')
.then(res => res.text())
.then(body => {
console.log(body)
const dom = new JSDOM(body, {pretendToBeVisual: true })
return dom.window.document.querySelector('.sportinfo_tab table').innerHTML
})
.then(table => console.log(table))
I expect the output to be the html of the table but as of now I only get the metadata and scripts in the response making the code crash when extracting innerHTML.
Why not use headless Google Chrome?
I think the site you quote does not work with --dump-dom, but you can activate --remote-debugging-port=9222 and do whatever you want, as described in https://developers.google.com/web/updates/2017/04/headless-chrome
Another useful reference:
How can I dump the entire Web DOM in its current state in Chrome?
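One additional note on the JSDOM attempt above: jsdom parses the fetched HTML but does not execute its <script> tags by default, which is why pretendToBeVisual alone left the table empty. jsdom's documented constructor options can enable script execution (whether the site's scripts then run to completion under jsdom is another matter):

```javascript
// Options for the JSDOM constructor. By default jsdom parses HTML but
// refuses to run its <script> tags, which is why the table stays empty.
const jsdomOptions = {
  runScripts: 'dangerously', // execute scripts embedded in the page (trusted content only!)
  resources: 'usable',       // also fetch and run external <script src> files
  pretendToBeVisual: true,   // provide requestAnimationFrame and friends
};
```

These would be passed as `new JSDOM(body, jsdomOptions)`; you would still need to wait for the page's scripts to populate the table before querying it.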

How do I load antlr/index into ace worker js file?

I am following the instructions given here to integrate antlr4 with the Ace editor, and I have trouble at the step var antlr4 = require('antlr4/index');. The author mentions that here we should use the Node.js require. However, Ace has another require that may cause problems. Thus he loaded another script providing the Node.js require and loaded antlr4/index with that require inside it.
I tried that too, but it still cannot find the script. From my browser's console, I can see the path from which it tries to load the script:
localhost:4200/./antlr4/index.js, and it fails to load it.
I am using Angular 7; the structure of my project was shown in a screenshot (not reproduced here).
Also, when loading a local JavaScript file using importScripts, it always fails when given a local path, whereas a URL from a CDN always works. But importScripts should support importing local files.
Where should I make changes and what else methods should I try?
Here are some of my code:
var ace_require = require;
window.require = undefined;
var Honey = { 'requirePath': ['..'] };
// importScripts("require.js")
// the script can't be imported through importScripts, so I pasted the
// whole script file here... (not shown)
var antlr4_require = window.require; // antlr4_require: antlr using the Node.js-style require
window.require = require = ace_require; // require: ace using its own require
var antlr4, LPMLNLexer, LPMLNParser;
try {
    window.require = antlr4_require;
    antlr4 = antlr4_require('antlr4/index'); // the browser gets stuck here, reporting the error
} finally {
    window.require = ace_require;
}

FirebaseStorage: Deleting a folder and all its contents using node.js [duplicate]

This question already has answers here:
Delete Folder from Firebase Storage using Google Cloud Storage
(2 answers)
Closed 5 years ago.
I want to delete a Firebase Storage folder and all its contents using node.js / Firebase Admin SDK but I'm not able to.
A similar question was asked in the google group below about a year ago and I'm wondering if there is a solution now:
https://groups.google.com/forum/#!topic/firebase-talk/aG7GSR7kVtw
I am able to delete a single file using node.js example below:
https://mzmuse.com/blog/how-to-upload-to-firebase-storage-in-node
But I'm not able to delete a folder and all its contents.
Any ideas? Am I missing something?
Here's the code I'm using
const keyFilename="path/to/my/private.json";
const projectId = "myprojectid";
const bucketName = `${projectId}.appspot.com`;
const gcs = require('@google-cloud/storage')({
projectId,
keyFilename
});
const bucket = gcs.bucket(bucketName);
THIS WORKS FINE - Deleting a single file
const deleteFile = 'users/user1/folder1/IMG_1.JPG'
const gcFile = bucket.file(deleteFile);
gcFile.delete((err,res)=>console.log(err||res));
THIS DOES NOT WORK - Deleting the folder and contents
const deleteFolder = 'users/user1/'
const gcFolder = bucket.file(deleteFolder);
gcFolder.delete((err,res)=>console.log(err||res));
--
THIS IS NOT A DUPE AS MARKED BY SOME MEMBERS
My question is specific to node.js and the answer given is for Java.
UPDATE
I found a page on the Google Cloud site that shows a way to delete all files under a directory (folder):
https://googlecloudplatform.github.io/google-cloud-node/#/docs/google-cloud/0.56.0/storage/bucket?method=deleteFiles
bucket.deleteFiles({
prefix: 'images/'
}, function(err) {
if (!err) {
// All files in the `images` directory have been deleted.
}
});
But I'm still not able to delete the folder itself
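The deleteFiles call above can be wrapped in a small helper (a sketch with a hypothetical name; the bucket object is passed in so the pattern can be exercised without real credentials). Cloud Storage "folders" are just shared name prefixes, so once every object under the prefix is gone, there is no folder object left to delete:

```javascript
// Hypothetical wrapper around bucket.deleteFiles. Cloud Storage has no
// real folders: objects simply share a name prefix, so deleting every
// object under the prefix makes the "folder" disappear from listings.
async function deleteFolderContents(bucket, folderPath) {
  const prefix = folderPath.endsWith('/') ? folderPath : folderPath + '/';
  await bucket.deleteFiles({ prefix });
  return prefix;
}
```

With a real bucket this would be called as `await deleteFolderContents(bucket, 'users/user1')`.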
From this post:
You can delete only the whole bucket, but you cannot delete a folder
in a bucket.
Use GcsService to delete files or folders.
String bucketName = "bucketname.appspot.com";
GcsService gcsService = GcsServiceFactory.createGcsService(RetryParams.getDefaultInstance());
gcsService.delete(new GcsFilename(bucketName, "test"));
