Changing the Brackets Shell name results in a WebSocket connection error - javascript

I am trying to build an app using the Brackets shell. More specifically, I am trying to build a custom code editor for a project, so instead of starting from scratch I am modifying Brackets.
So far I have been able to work through all issues until I got to the Brackets Shell. I want to be able to install my app beside Brackets, so it has to have a different name and separate preferences. I followed this guide on how to rename a Brackets Shell app. Here are the files I changed:
Gruntfile.js – change the build name
appshell/config.h – change the app name for Windows and OS X
appshell_config.gypi – change the app name as well
After running grunt setup and grunt build-mac my app launches and seems to work fine. I can change preferences in my app without affecting the original Brackets app (because they have different Application Support directories). I did not notice any issues until I opened the console where I saw the following error:
WebSocket connection to 'ws://localhost:50642/' failed: HTTP Authentication failed;
no valid credentials available NodeConnection.js:84
So I tried changing my app's name back to Brackets in all three files, and the issue goes away. My guess is that somewhere in the code my app is still trying to connect to the original app named Brackets. I'm guessing there is a 4th config file I need to change, but I am not familiar enough with Brackets to be able to locate it. Without the connection, Live Preview and ESLint don't work.
I have tried inserting console.trace to reverse engineer how the NodeConnection works between the Brackets Shell and the Brackets source code itself, but that didn't help much. Does anyone know how to change the name of the Brackets Shell without breaking NodeConnection at runtime?
I also searched for processes on port 50642 and confirmed the server is running.

You need to modify Node Core
Brackets Shell is hardwired to reject any call that is not from an app named Brackets. Open the file brackets/appshell/node-core/Server.js. As of this answer, you need to change line 205. Just in case that is different in the future, you can find the commit I'm looking at here.
Here is what is causing the issue:
wsServer = new WebSocket.Server({
    server: httpServer,
    verifyClient : function (info, callback) {
        // Accept connections originated from local system only
        // Also do a loose check on user-agent to accept connection only from Brackets CEF shell
        if (info.origin === "file://" && info.req.headers["user-agent"].indexOf(" Brackets") !== -1) {
            callback(true);
        } else {
            // Reject the connection
            callback(false);
        }
    }
});
The problem is info.req.headers["user-agent"].indexOf(" Brackets"). As you can see, it rejects any connection that is not from Brackets. Change Brackets to whatever your app is called.
Make sure you format the name correctly
If your app has a space in it (e.g. New Brackets), then you would remove the space when checking the user-agent. In this example you would check the user-agent like so: info.req.headers["user-agent"].indexOf(" NewBrackets").
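For reference, here is a minimal sketch of the modified check, assuming the app was renamed to New Brackets (so the user-agent token to look for becomes NewBrackets; substitute your own name):
// appshell/node-core/Server.js (sketch) -- only the user-agent token changes.
// "NewBrackets" is a placeholder for your app name with spaces removed.
wsServer = new WebSocket.Server({
    server: httpServer,
    verifyClient: function (info, callback) {
        // Accept connections originated from the local system only
        var isLocal = info.origin === "file://";
        // Loose check on user-agent to accept connections only from our renamed CEF shell
        var isOurShell = info.req.headers["user-agent"].indexOf(" NewBrackets") !== -1;
        callback(isLocal && isOurShell);
    }
});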
Congrats! You have built New Brackets.

Related

Strip file paths from errors sent to Sentry

I'm setting up a self-hosted Sentry for a CRA project I'm working on.
I'm having trouble getting Sentry to understand where my artefacts are located (JS and source maps). Uploading the JavaScript and source maps works as expected, and I can see them for each of my releases.
They are available at ~/main.XYZ.js and ~/main.XYZ.js.map respectively inside Sentry.
When my app at https://www.website.com/path/to/file/ records an error, it sends it to Sentry successfully. But the error reports the path of the JS file that crashed as ~/path/to/file/main.XYZ.js, so when Sentry tries to resolve the error, it looks at ~/path/to/file/main.XYZ.js instead of ~/main.XYZ.js (where it actually is).
Is there any way to strip the path and just have Sentry look at the root directly?
Note: I was successful when setting the URL prefix to ~/path/to/file. However, that approach won't work for me later in the real world, since that path is subject to change depending on what stage of development I'm in.
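One approach that might work here (a sketch only, assuming the @sentry/browser SDK with the RewriteFrames integration from @sentry/integrations, not verified against a self-hosted setup): rewrite each stack frame's filename on the client before the event is sent, so every frame points at ~/<file> regardless of the page path.
// Sketch: strip directories from frame filenames so they match the uploaded artefacts.
// "YOUR_DSN" and "YOUR_RELEASE" are placeholders; adapt to your SDK version.
import * as Sentry from '@sentry/browser';
import { RewriteFrames } from '@sentry/integrations';

Sentry.init({
  dsn: 'YOUR_DSN',
  release: 'YOUR_RELEASE',
  integrations: [
    new RewriteFrames({
      iteratee: (frame) => {
        if (frame.filename) {
          // e.g. https://www.website.com/path/to/file/main.XYZ.js -> ~/main.XYZ.js
          const fileName = frame.filename.split('/').pop();
          frame.filename = '~/' + fileName;
        }
        return frame;
      },
    }),
  ],
});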

React/Next.js: How to debug differing nodes of an SSR React application?

I'm running a Next.js application with SSR.
But I do get the error:
Warning: Did not expect server HTML to contain a <div> in <div>.
So there seems to be a difference between the server-side and the client side nodes. How can I find those differences?
This is the repo of an example app:
https://github.com/jaqua/nextjs-app
Just run npm install and npm run dev
As comparing two HTML documents manually can be rather cumbersome depending on the size of your page, it's advisable to first assess what could be wrong rather than brute-forcing it. In my experience, in 99% of cases an SSR mismatch occurs when you either:
Included and rendered a component which doesn't behave the same way on the client and the server (e.g. it uses global variables to determine where the code is being run and conditionally renders elements based on that). For example, there was a clipboard module that would only work on the client because it relied on the window global (a sketch of this pattern follows after this list).
Rendered data fetched from an asynchronous source which is only present on either the server or the client. You need to make the same data available to both during the initial render.
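To illustrate the first cause, here is a hypothetical component (not taken from the linked repo): branching on window during render produces different markup on the server and the client, while deferring the branch until after mount keeps the first client render identical to the server output.
// Hypothetical example: render the same placeholder on the server and on the first
// client render, and only touch window after the component has mounted on the client.
import React from 'react';

class CopyShortcutHint extends React.Component {
  constructor(props) {
    super(props);
    this.state = { mounted: false };
  }

  componentDidMount() {
    // Runs on the client only, after hydration, so it cannot cause a mismatch.
    this.setState({ mounted: true });
  }

  render() {
    if (!this.state.mounted) {
      // Exactly what the server rendered.
      return <span>Copy</span>;
    }
    // Safe to read window here.
    const isMac = window.navigator.platform.indexOf('Mac') !== -1;
    return <span>Copy ({isMac ? 'Cmd+C' : 'Ctrl+C'})</span>;
  }
}

export default CopyShortcutHint;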
If nothing comes to mind after this, you need to proceed by elimination. If the error occurs on every page, it is likely the result of a server misconfiguration. For example, are you doing your own renderToString? Double-check that you didn't add an extra nested div in there; the string should sit right inside the element you mount React on.
If that is not the case, try removing the components you are rendering one by one, and you should be able to narrow down pretty quickly which one is causing your issue.
Also keep in mind that you need to restart your server every time you make a change (unless you have nodemon or a similar config reloading the server-side code when you modify your source) for it to take effect!
As a last resort, you could potentially make your own diff between the server response and the client first render.
1) Open your console from your site, and paste the following:
console.log(document.documentElement.innerHTML)
2) Click on the Copy button, and paste that in a client.html file
3) Now run in your terminal:
curl YOUR_URL > server.html
4) The server will likely return a minified version of your HTML, so you need to indent it to make it match your client HTML; use something like this for that purpose.
5) Once you've done this, you can now run the actual diff in your terminal:
diff server.html client.html
This will list you every part of the files that differ between each other.
You can ignore diffs related to JavaScript, as the indenting will most likely be bad anyway, but concentrate on the HTML ones, where you might be able to spot differences and infer what is going wrong.
In your case, your translation system is likely to be the root cause of the issue. I would advise following more standard practices rather than next-i18next, which seems pretty new and more likely to have problems. Someone else apparently also has an issue with SSR, and to be honest, stuff like this is quite scary.
I know it can look a bit troublesome to set up, but here is my own i18n config, which can be required either on the server or the client, provided you specify a global variable to determine which environment you are in (here __BROWSER__).
import i18n from 'i18next'
import LanguageDetector from 'i18next-browser-languagedetector'
import { reactI18nextModule } from 'react-i18next'

i18n
  .use(require(__BROWSER__ ? 'i18next-xhr-backend' : 'i18next-node-fs-backend'))
  .use(LanguageDetector)
  .use(reactI18nextModule)
  .init({
    fallbackLng: 'en',
    ns: ['translations'],
    defaultNS: 'translations',
    interpolation: {
      escapeValue: false,
    },
    react: {
      wait: true,
    },
    backend: {
      loadPath: __BROWSER__
        ? '/locales/{{lng}}/{{ns}}.json'
        : require('path').join(__dirname, '/locales/{{lng}}/{{ns}}.json'),
    },
  })

export default i18n
You simply need to use the middleware, serve the locales from your server so the client can load them over XHR, and pass the i18n instance to the I18nextProvider. The full SSR docs are here.
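As a rough sketch of the server side of that wiring (the middleware package name and paths here are assumptions based on the i18next ecosystem at the time; adapt them to your versions and build setup):
// server.js (sketch) -- hook the i18n instance above into an Express server for SSR.
const express = require('express');
const i18nextMiddleware = require('i18next-express-middleware');
// Assumes the config above is transpiled, so its default export is available as .default.
const i18n = require('./i18n').default;

const app = express();

// Detect the request language and expose req.i18n / req.t to the render.
app.use(i18nextMiddleware.handle(i18n));

// Serve the translation files so the browser backend can fetch them over XHR.
app.use('/locales', express.static('locales'));

// ...then render the app wrapped in <I18nextProvider i18n={req.i18n}> as usual.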
I would start by looking at the HTML that gets to the browser (Network tab in Chrome DevTools). React is probably rendering client-side anyway, so you can then see the current DOM after the client-side render and compare (go to the Elements tab in Chrome DevTools, right-click the html element and select "Copy > Copy outerHTML").
If that fails, you can try adding breakpoints in the browser inside React itself:
the canHydrateInstance function in ReactDOMHostConfig.js
https://github.com/facebook/react/blob/c954efa70f44a44be9c33c60c57f87bea6f40a10/packages/react-dom/src/client/ReactDOMHostConfig.js
Possibly relevant links to the same kind of issue:
React 16 warning "warning.js:36 Warning: Did not expect server HTML to contain a <div> in <div>."
https://github.com/zeit/next.js/issues/5367

Parse request.object.get("KEY") always returns undefined

I have a strange problem over here. I have a project built with Parse.com as a backend (using Cloud Code to verify some things when a connection to the database is made). Everything works just as it should.
But here comes the problem. Another developer reported to me that something is wrong because he is getting undefined every time he calls request.object.get('KEY') in Cloud Code. This developer uses the exact same codebase as I do.
So I decided to have a look at it. While with my Parse account every application works fine (even newly created ones), with the other developer's Parse account not a single new application we created seems to work with the exact same code. And it gets even stranger: creating a completely new Parse account and a new application produces the same errors, while my personal account and applications continue to work fine.
So what is the problem? We are using Cloud Code, and here is sample code (in JavaScript) of a beforeSave method:
Parse.Cloud.beforeSave('Activity', function(request, response) {
    var currentUser = request.user;
    var objectUser = request.object.get('fromUser');
    if (!currentUser || !objectUser) {
        response.error('An Activity should have a valid fromUser.');
    } else {
        response.success();
    }
});
And every time, request.object.get('KEY') returns undefined for every key I previously set in the iOS code before uploading the PFObject.
Note that with my personal account everything is fine...
I have already seen this thread; however, deleting ACLs didn't do the trick. request.object.get() stays undefined while request.user is defined, for every tested Parse account except mine.
EDIT 1
I also had a look at the activity object just before it is uploaded, and there all the fields are properly set.
EDIT 2
After removing the Cloud Code completely, the objects are correctly uploaded to Parse, with all the fields set the way they were in the iOS client. So it seems that something is wrong with Parse's Cloud Code: as soon as an object passes through it, it loses all its fields.
Finally I was able to solve this. This is definitely a bug in Parse's JavaScript SDK. I changed the JavaScript SDK version in global.json back to "1.4.2" instead of "latest", uploaded this to the Cloud Code folder, and everything went back to normal.
You can also test other versions; maybe v1.5.0 works too, but as soon as I found out v1.4.2 worked fine, I didn't try more recent versions.
EDIT
So, I discovered that Parse must have changed something in their command-line tool. It seems that the global.json file isn't there anymore if you create your Cloud Code folder with the most recent version of the tool. However, you can create it manually and upload the complete folder to your Parse app.
This is what my Cloud Code folder looks like, just as an example:
The Cloud Code folder contains three subfolders:
• cloud - containing cloud code files
• config - containing the global.json file
• public - containing the index.html file
The global.json file contains these lines of code:
{
    "global": {
        "parseVersion": "1.4.2"
    },
    "applications": {
        "YOUR_PARSE_APPS_NAME": {
            "applicationId": "YOUR_APP_ID",
            "masterKey": "YOUR_APP_MASTER_KEY"
        },
        "_default": {
            "link": "YOUR_PARSE_APPS_NAME"
        }
    }
}

How to get my 404 page to show after upgrade of Sails.js to 0.10.x?

I've upgraded my Sails.js app to 0.10.x, and now when I point my browser at a non-existent route such as http://localhost:1337/notfound, instead of my views/404.jade being served up I just get a bit of JSON:
{
    "status": 404
}
I built a default new Sails app (sails new dummy --template=jade) just to compare and contrast with what I have in my existing app, and the only obvious difference I see is that in dummy/config/ there is a file called http.js.
I've copied that file over to my app but it's made no difference.
I've also ensured that the tasks in dummy/tasks are identical to my own app's tasks.
In dummy/config/routes.js it says:
Finally, if those don't match either, the default 404 handler is triggered.
See config/404.js to adjust your app's 404 logic.
which is obviously out of date, as 0.10.x apps use the api/responses mechanism.
So I am at rather a loss as to how to get this to work.
I am using 0.10.0-rc8 (and I have confirmed that this is the same in my dummy app as well as my actual app)
Well I've fixed this but I have no idea why it was happening.
To fix it I created a new project as per the method described in my question, but with the same name as my existing project, then, file-by-file, I painstakingly moved across everything specific to my app, taking care to leave in place anything new generated by sails.
Once I'd done that I ran my tests (all passed), then ran sails lift and tried it, and yay, everything worked and I got my 404 error page back.
I committed my changes and then carefully picked through a comparison of what had changed.
Alas, nothing at all stands out, so while I have solved my problem, I have no idea what the original cause was.
From the section in the migration guide on "Custom Responses":
In v0.10, you can now generate your own custom server responses. See here to learn how. Like before, there are a few that we automatically create for you. Instead of generating myApp/config/500.js and other .js responses in the config directory, they are now generated in myApp/api/responses/. To migrate, you will need to create a new v0.10 project and copy the myApp/api/responses directory into your existing app. You will then modify the appropriate .js file to reflect any customization you made in your response logic files (500.js, etc.).
Basically, v0.10.x gives you more freedom in how things like res.notFound, res.serverError and even res.ok() (success response) work, but you need to copy over the new api/responses folder to migrate.
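For orientation, a stripped-down sketch of what a customized api/responses/notFound.js might look like in v0.10 (the file Sails generates is more elaborate; this only shows the JSON-versus-view decision):
// api/responses/notFound.js (sketch) -- invoked via res.notFound() or the 404 handling.
module.exports = function notFound(data, options) {
  var req = this.req;
  var res = this.res;

  res.status(404);

  // Programmatic clients get JSON; browsers get the 404 view.
  if (req.wantsJSON) {
    return res.json({ status: 404 });
  }
  return res.view('404', { data: data });
};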
I had the same issue but was using 0.9.x. There's probably a better solution, but I output a view instead of JSON in all cases.
Update config/404.js to replace res.json() with res.view():
if (err) {
    // return res.json(result, result.status);
    return res.view('404');  // <-- output the 404 view instead
}
Then, just make sure your routes.js file will serve the 404 view. Place the following at the bottom of your routes.js file:
'/*': {
    view: '*'
},

How can I edit my server files without restarting Node.js when I want to see the changes?

I'm trying to set up my own Node.js server, but I'm having a problem. I can't figure out how to see changes to my application without restarting it. Is there a way to edit the application and see changes live with Node.js?
Nodules is a module loader for Node that handles auto-reloading of modules without restarting the server (since that is what you were asking about):
http://github.com/kriszyp/nodules
Nodules does intelligent dependency tracking so the appropriate module factories are re-executed to preserve correct references when modules are reloaded without requiring a full restart.
Check out Node-Supervisor. You can give it a collection of files to watch for changes, and it restarts your server if any of them change. It also restarts it if it crashes for some other reason.
"Hot-swapping" code is not enabled in NodeJS because it is so easy to accidentally end up with memory leaks or multiple copies of objects that aren't being garbage collected. Node is about making your programs accidentally fast, not accidentally leaky.
EDIT, 7 years after the fact: Disclaimer, I wrote node-supervisor, but had handed the project off to another maintainer before writing this answer.
If you would like to reload a module without restarting the Node process, you can do this with the help of the watchFile function in the fs module and the cache-clearing feature of require.
Let's say you loaded a module with a simple require:
var my_module = require('./my_module');
To watch that file and reload it when it is updated, add the following in a convenient place in your code:
var fs = require('fs');

fs.watchFile(require.resolve('./my_module'), function () {
    console.log("Module changed, reloading...");
    delete require.cache[require.resolve('./my_module')];
    my_module = require('./my_module');
});
If your module is required in multiple files, this operation will not affect the other assignments, so keeping the module in a global variable and using it from there wherever it is needed, rather than requiring it several times, is an option. The code above then becomes:
global.my_module = require('./my_module');
//..
fs.watchFile(require.resolve('./my_module'), function () {
    console.log("Module changed, reloading...");
    delete require.cache[require.resolve('./my_module')];
    global.my_module = require('./my_module');
});
Use this:
https://github.com/remy/nodemon
Just run your app like this: nodemon yourApp.js
There should be some emphasis on what's happening, instead of just shotgunning modules at the OP. Also, we don't know that the files he is editing are all JS modules or that they all use the "require" call. Take the following scenarios with a grain of salt; they are only meant to describe what is happening so you know how to work with it.
Your code has already been loaded and the server is running with it
SOLUTION: You need a way to tell the server what code has changed so that it can reload it. You could set up an endpoint to receive a signal, a command on the command line, or a request over TCP/HTTP that tells it which file changed, and the endpoint will reload it.
// Using Express
var fs = require('fs');

app.get('/reload/:file', function (req, res) {
    fs.readFile(req.params.file, function (err, buffer) {
        // do stuff with the updated file, then respond
        res.end();
    });
});
Your code may have "require" calls in it, which load and cache modules
SOLUTION: Since these modules are cached by require, following the previous solution you would need a line in your endpoint to delete that reference:
var moduleName = req.params.file;
// The require cache is keyed by the resolved path, not the raw name
delete require.cache[require.resolve('./' + moduleName)];
require('./' + moduleName);
There are a lot of caveats behind all of this, but hopefully you have a better idea of what's happening and why.
What's “Live Coding”?
In essence, it's a way to alter the program while it runs, without restarting it. The goal, however, is to end up with a program that works properly when we (re)start it. To be useful, it helps to have an editor that can be customized to send code to the server.
Take a look: http://lisperator.net/blog/livenode-live-code-your-nodejs-application/
You can also use the tool PM2, which is an advanced production process manager for Node.js.
http://pm2.keymetrics.io/
I think node-inspector is your best bet.
Similar to how you can live-edit client-side JS code in Chrome DevTools, this utilizes the Chrome (Blink) DevTools interface to provide live code editing.
https://github.com/node-inspector/node-inspector/wiki/LiveEdit
A simple, direct solution with reference to all the answers available here:
The Node documentation says that fs.watch is more efficient than fs.watchFile, and it can watch an entire folder.
(I just started using this, so not really sure whether there are any drawbacks)
fs.watch("lib", (event_type, file_name) => {
console.log("Deleting Require cache for " + file_name);
delete require.cache[ require.resolve("./lib/" + file_name)];
});
