I've installed Jetty using Homebrew on macOS. Jetty is in the usual folder:
/usr/local/Cellar/jetty/9.4.5.v20170502
However, we are not going to use this for anything related to Java; it's just our requirement that the web server we will have in production is Jetty. Our code is pure HTML and some JS — only static files. All the tutorials on the web involve modifying XML files and setting up complex folder structures for Java apps. We don't need any of that. All we want is to put HTML/JS files in some specific place.
Is there an easy way to do this? Do we have to create a start.jar for this simple task? Do we have to create a WAR package via Maven or something else? I'd appreciate any simple pointers to getting this dev environment set up. Thank you!
It seems you might be overthinking this a bit. start.jar is merely the JAR file that starts up the server. You can easily host static content out of Jetty. I'm unsure of your specific requirements, but for the sake of an example I'll assume you have a simple webapp with a page and some resources. Your implementation might look like this.
Inside the Jetty 9.4.5.v20170502 directory, also called Jetty_Home (you can learn about Jetty_Home vs Jetty_Base here), you'll find the following directories:
bin demo-base etc lib logs modules resources webapps
These are the standard of truth and should not be modified (though demo-base can be removed entirely if desired). You will want to create your own Jetty_Base directory to host your content out of; let's call it stackoverflow for this example. It is worth noting that a Jetty_Base can exist anywhere; I am just keeping it here for the ease of this example.
Inside the stackoverflow directory you'll find nothing to start with, so we will need to enable some modules to populate our Jetty_Base. I have no idea what you plan to do with this web server, so I will keep it minimal — just enough to host the content. From inside this directory run:
> java -jar ../start.jar --create-startd
MKDIR : ${jetty.base}/start.d
INFO : Base directory was modified
This will create a start.d directory for all of our module files to live in, which makes modifying and configuring things a breeze (though for the sake of this example everything will be left at its defaults). Now we need to add the modules that will enable the functionality on the server. You can view the whole list of available modules by running:
> java -jar ../start.jar --list-modules
I am not going to paste the whole list here but instead enable the following:
> java -jar ../start.jar --add-to-start=server,client,deploy,http,webapp,jsp
INFO : webapp initialized in ${jetty.base}/start.d/webapp.ini
INFO : server initialized in ${jetty.base}/start.d/server.ini
INFO : security transitively enabled
INFO : apache-jsp transitively enabled
INFO : servlet transitively enabled
INFO : jsp initialized in ${jetty.base}/start.d/jsp.ini
INFO : jndi transitively enabled
INFO : client initialized in ${jetty.base}/start.d/client.ini
INFO : http initialized in ${jetty.base}/start.d/http.ini
INFO : annotations transitively enabled
INFO : plus transitively enabled
INFO : deploy initialized in ${jetty.base}/start.d/deploy.ini
MKDIR : ${jetty.base}/webapps
INFO : Base directory was modified
This will enable all the modules I listed (server, client, deploy, http, webapp, jsp) as well as any dependencies required for them to operate, and create any required folders (such as webapps).
Now, I created a very small webapp named example and moved it into the webapps directory inside our stackoverflow Jetty_Base:
> tree
.
├── start.d
│   ├── client.ini
│   ├── deploy.ini
│   ├── http.ini
│   ├── jsp.ini
│   ├── server.ini
│   └── webapp.ini
└── webapps
    ├── example
    │   ├── images
    │   │   ├── jetty-header.jpg
    │   │   └── webtide_logo.jpg
    │   ├── index.html
    │   └── jetty.css
    └── example.xml
For this webapp I created a context XML file named example.xml; it only serves static content, and it is very simple:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE Configure PUBLIC "-//Jetty//Configure//EN" "http://www.eclipse.org/jetty/configure_9_3.dtd">
<Configure id="exampleApp" class="org.eclipse.jetty.webapp.WebAppContext">
  <Set name="contextPath">/example</Set>
  <Set name="war"><Property name="jetty.webapps" default="."/>/example</Set>
</Configure>
This configures the webapp so that when I run the server I can navigate to, in this case, localhost:8080/example and see my content. Now all that's left is to run the server. From inside the stackoverflow directory:
java -jar ../start.jar
2017-06-08 10:36:32.300:INFO::main: Logging initialized #427ms to org.eclipse.jetty.util.log.StdErrLog
2017-06-08 10:36:32.477:INFO:oejs.Server:main: jetty-9.4.5.v20170502
2017-06-08 10:36:32.494:INFO:oejdp.ScanningAppProvider:main: Deployment monitor [file:///Users/example/installs/repository/jetty-distribution-9.4.5.v20170502/stackoverflow/webapps/] at interval 1
2017-06-08 10:36:32.633:INFO:oeja.AnnotationConfiguration:main: Scanning elapsed time=26ms
2017-06-08 10:36:32.668:INFO:oejs.session:main: DefaultSessionIdManager workerName=node0
2017-06-08 10:36:32.668:INFO:oejs.session:main: No SessionScavenger set, using defaults
2017-06-08 10:36:32.669:INFO:oejs.session:main: Scavenging every 600000ms
2017-06-08 10:36:32.692:INFO:oejsh.ContextHandler:main: Started o.e.j.w.WebAppContext#6e06451e{/example,file:///Users/example/installs/repository/jetty-distribution-9.4.5.v20170502/stackoverflow/webapps/example/,AVAILABLE}{/example}
2017-06-08 10:36:32.713:INFO:oejs.AbstractConnector:main: Started ServerConnector#21a947fe{HTTP/1.1,[http/1.1]}{0.0.0.0:8080}
2017-06-08 10:36:32.714:INFO:oejs.Server:main: Started #840ms
And now you have a server up and running with content being served. I hope this helps. It can seem overwhelming at first, but it isn't so bad once you get your hands dirty. I recommend reading up more in the official documentation.
Is there a recommended way to enforce deployment order across specific apps using Turborepo? I know you can specify that all child dependents run first, but that results in undesired behavior in my scenario.
Here is an example of my file structure:
├── apps
│   ├── backend
│   └── web
├── packages
│   ├── assets
│   ├── config
│   ├── design-system
│   ├── hooks
│   └── utils
And here is the command I'm running to deploy:
yarn turbo run deploy:ci --filter=...[origin/main] --dry-run
In my scenario, I'd like my apps/backend to deploy before apps/web because web relies on output from the backend. I thought about using the following turbo.json:
{
  "$schema": "https://turborepo.org/schema.json",
  "baseBranch": "origin/main",
  "pipeline": {
    "deploy:ci": {
      "dependsOn": ["^deploy:ci"],
      "outputs": [".sst/**", ".build/**", ".expo/**"]
    }
  }
}
However, while this works if I add backend as a devDependency of web, it also results in backend always being rebuilt, even when none of its dependencies have changed. For example, if I change packages/hooks (which backend does not rely on), Turborepo will try to deploy packages/utils because hooks uses the utils package; this waterfalls and causes it to try to deploy backend, because backend uses utils.
I'd also like to note that only the apps/* contain deploy:ci scripts, so there is really no need for it to try to deploy changes to any packages/* dependencies.
My end goal would look like the following:
Change packages/hooks
Detect change in packages/hooks and trigger deploy:ci for apps/web (which has hooks as a dependency)
Or
Change packages/utils
Detect change in packages/utils and try to deploy both apps/backend and apps/web because they both rely on utils
I've tried replacing
"dependsOn": ["^deploy:ci"],
with
"dependsOn": [],
and this does result in only the correct packages being rebuilt, but the deploy order is arbitrary. Ideally, I'd keep this latter behavior while still enforcing that backend always goes before web.
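One option worth trying (a sketch only — I haven't verified it against this exact setup) is Turborepo's package-scoped task syntax, which pins an explicit ordering between just these two apps without making every package a dependency of the deploy:

```json
{
  "$schema": "https://turborepo.org/schema.json",
  "pipeline": {
    "deploy:ci": {
      "outputs": [".sst/**", ".build/**", ".expo/**"]
    },
    "web#deploy:ci": {
      "dependsOn": ["backend#deploy:ci"],
      "outputs": [".sst/**", ".build/**", ".expo/**"]
    }
  }
}
```

With this, web's deploy:ci always waits for backend's deploy:ci, while change detection and caching should still skip backend when none of its own inputs changed.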
I have recently been working with p5.js in the Atom IDE and have been trying to use a local copy of the p5.js library downloaded from their website. As a minimal example, I have tried to serve a basic sketch.js file using an Atom web-server package (atom-live-server-plus), referring to the p5.js library located in the parent folder of the Atom project. This way I only need one copy of the p5.js library, and all my numerous p5.js projects can refer to the same library without it being duplicated.
The issue: I have been unable to execute the code since an issue occurs when I specify the p5.js library location in the .html file using '..' syntax. Example:
<script src="..\libraries\p5.js"></script>.
Through tests I have managed to identify this as the issue by finding two alternative methods that get the sketch.js file to work:
A) Insert the p5.js library file into the root directory, i.e. the project folder (test_1), and reference it directly in the .html file. Example:
<script src="p5.js"></script>
B) Reference an online version of the p5.js library in the .html file. Example:
<script src="https://unpkg.com/p5"></script>
In both cases the sketch.js file loads on the online server as expected.
However, I would like to understand why the '..' syntax for the parent folder does not work, especially as I have recently downloaded the Generative Design book code examples (https://github.com/generative-design/Code-Package-p5.js), which require an extensive set of libraries for each example's own sketch.js file. It is therefore in my best interest to keep one local library folder in the parent directory rather than copy it into each individual project's directory, so I need a way to refer to a parent-folder library in the .html file. Could you provide any insight into my issue?
Minimal Example Code Structure:
p5_work
|
+-- libraries
|   |
|   +-- p5.sound.js
|   |
|   +-- p5.js
|
+-- test_1 (Atom project)
    |
    +-- index.html
    |
    +-- sketch.js
index.html Code:
<!DOCTYPE html>
<html lang="">
  <head>
    <meta charset="utf-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>p5.js test_1 example</title>
    <style>
      body {
        padding: 0;
        margin: 0;
      }
    </style>
    <script src="..\libraries\p5.js"></script>
    <script src="sketch.js"></script>
  </head>
  <body>
    <main>
    </main>
  </body>
</html>
sketch.js Code:
function setup() {
  createCanvas(720, 400);
  background(0);
}

function draw() {
  fill(255, 0, 0);
  ellipse(50, 50, 80, 80);
}
While I am not especially qualified to answer questions about JavaScript and p5, I tried to build a repository similar to the ones I always make with Python and faced this same problem, which almost made me quit the p5 learning project. After reading many, many articles I found the following:
It probably won't work. If your server setup is normal, only public and the folders below it are accessible from the client side. From the client's point of view, public is the root; there is nothing above that.

As for importing bootstrap, see if you can browse what node is making available in your browser. It's probably hosting the scripts you need to reference on the client side at another URL. You might also find the URL in the bower documentation/examples.

You could solve this by copying or symlinking the scripts into the public directory, but that would be a bit hackish; save it for if you get completely fed up with finding the intended way.
See more details here:
Referencing a parent directory in HTML
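You can see the quoted point directly by resolving the relative URL the way a browser would. A small Node sketch (the localhost address is hypothetical; note also that URLs should use forward slashes, not backslashes):

```javascript
// The page is served from the document root (test_1), so '..' climbs
// above the root — the server never exposes anything up there.
const pageUrl = 'http://localhost:8080/index.html'; // hypothetical live-server address
const resolved = new URL('../libraries/p5.js', pageUrl);
console.log(resolved.href); // → http://localhost:8080/libraries/p5.js
```

In other words, the request is not malformed — it resolves to a path outside the served tree, which is why moving p5.js inside the project (option A) or using a CDN (option B) works.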
The best solution I have found so far is based on the p5-manager npm package.
Quick Start
$ npm install -g p5-manager
There are several use cases for p5-manager. Before going further, choose the one that best describes your requirements and go ahead.
Step 1: Initialize a new collection
$ p5 new my_collection
Running this command will create a collection directory and add some p5 libraries to it. See the output log:
# create : my_collection
# create : my_collection/libraries
# create : my_collection/libraries/p5.js
# ...
Step 2: Generate a p5 project
$ cd my_collection
$ p5 generate my_project
# or...
$ p5 g my_project
This will generate a p5 project folder with default templates in it. (Make sure you are running this command in a collection directory.)
# create : my_project
# create : my_project/sketch.js
# create : my_project/index.html
Step 3: Start the server and have fun!
$ p5 server
# or...
$ p5 s
Now edit your sketch.js and go to localhost:5555; p5-manager will do the rest. The server supports live reload by default. (Note: you should run p5 server in a collection directory, not in a project directory.)
More details are here:
https://www.npmjs.com/package/p5-manager
Now I can run as many sketches as I want without duplicating the library for each one. My previous setup involved a separate download of the p5 library for every sketch, but now I have overcome this problem and the structure is ready for my p5 learning journey. I hope you find this post helpful.
.
├── Library/
│   ├── index.html
│   └── sketch.js
├── LICENSE
├── README.md
├── database/
├── develop/
├── libraries/
│   ├── p5.dom.js
│   ├── p5.js
│   └── p5.sound.js
├── node_modules/
├── notes/
├── package-lock.json
├── package.json
├── services/
├── src/
│   ├── index.html
│   └── sketch.js
├── testing/
└── typescript
One last note, I promise: this issue has appeared in many places and is still unsolved for many people:
node.js require from parent folder
Referencing a parent directory in HTML
https://discourse.processing.org/t/p5-js-not-found-in-parent-directory/21981
The file is called p5.min.js, not p5.js.
I really like the folder structure shown here when dealing with a React frontend and a backend with Express:
root
├── backend
│   ├── node_modules
│   ├── public
│   ├── src
│   │   └── Server.ts
│   ├── package.json
│   └── tsconfig.json
└── frontend (created using create-react-app)
    ├── node_modules
    ├── public
    ├── src
    │   └── Index.js
    ├── package.json
    └── tsconfig.json
I think that having separate packages with individual node_modules is reasonable, since the frontend and backend are basically completely different things — e.g. they need different node modules. Also, this modular approach is visually appealing to me and keeps the repository tidy.
However, I encounter a problem with this structure when I need to share content between the frontend and the backend. I added a shared folder under the root of the project which contains its own project with its own tsconfig.json, package.json and so on. This approach is a mix of the approaches here and here. For the backend, this works totally fine: having set up the tsconfig.json appropriately (using TypeScript Project References and aliased imports), I can reference the file root/shared/src/myFile.ts like this:
import { myFunction } from '#shared/myFile';
I created the React frontend using create-react-app. It's ok for me that alias imports don't work, so I would have to use (inside the src folder in frontend):
import { myFunction } from '../../shared/src/myFile';
Sadly, these imports from outside the src directory are not supported by create-react-app and I don't want to use eject since I have no experience with webpack and don't want to maintain all the configuration files on my own (that's why I used create-react-app in the first place).
I know I could move the shared content into the frontend's src directory. But then I would have to add the settings needed for TypeScript Project References, e.g. setting composite to true, to the frontend's tsconfig.json, which seems odd to me and feels more like a hack. I'd like to have a separate npm project for my shared content.
Since create-react-app does not inherently support imports from outside the src directory, I thought that maybe I'm getting the big picture wrong. Isn't the folder structure I use right now a valid way to set up a React project with a backend? What mechanism does create-react-app provide to share files between the frontend and the backend? I could also imagine having a root project with a src folder containing the two folders backend and frontend, but that would mean one shared node_modules folder in root.
It's my first project with React and I'd love to get to know some best practices for this kind of architectural problem. Some links to trusted resources where project structures for full-stack React development are explained would be really helpful. Thank you 😊
It's perfectly reasonable to want to share code between your front and back end. It's one of the reasons to code in JavaScript instead of Ruby or PHP.
You can accomplish exactly what you want by using yarn instead of npm and yarn workspaces: https://yarnpkg.com/lang/en/docs/workspaces/. At the top level you set up three modules/packages in your package.json (make sure you name the workspaces correctly in their respective package.json files):
"workspaces": {
  "packages": [
    "backend",
    "frontend",
    "shared"
  ]
},
Once you do, you can import shared code in your CRA app or your back end simply like this:
import { myFunction } from 'shared/src/myFile';
The drawback is that you can't import React components from the shared directory into frontend as long as you are using CRA. This won't affect you now, since you only have one React app. Should you need to share React components among multiple projects, look into one of the suggestions in the other answers, like bit.dev.
ADDENDUM: It's now possible to use CRA and yarn workspaces to share React code if you replace CRA with CRACO. All you do is create another workspace with the shared React code, then create a symbolic link in each module where you want to access it:
root
├── frontend-one
│   └── symbolic link to frontend-shared
├── frontend-two
│   └── symbolic link to frontend-shared
└── frontend-shared
Each of the CRA frontend modules also requires a craco.config.js file where you tell webpack not to follow symbolic links:
module.exports = {
  // ...
  webpack: {
    configure: {
      resolve: {
        symlinks: false
      }
    }
  }
};
You import from frontend-shared normally:
import { myFunction } from 'frontend-shared/src/myFile';
It's a pretty lo-fi solution, but it has proven robust for the year we've been using it.
Architecture is a tricky one: everyone has a different opinion, and every option has pros and cons.
Personally I believe it's best to separate the backend and frontend into separate projects and keep them that way. Since JavaScript/React/Node encourage component-based approaches, a really nice way of sharing code between them is Bit.dev.
https://bit.dev
I am currently using it to share components and functions between three web apps and a few Node microservices.
A good structure for a React app can be found here; this one works well and scales nicely:
https://hackernoon.com/fractal-a-react-app-structure-for-infinite-scale-4dab943092af
As for Express, there are many ways to structure the project, but I personally recommend a folder for your routes and a folder for your controllers (where the logic for the routes lives), and going from there. Check this link out:
https://www.freecodecamp.org/news/how-to-write-a-production-ready-node-and-express-app-f214f0b17d8c/
Depending on what you're building, you may not even need a full backend; check out the JAMstack here for more info:
https://jamstack.org
I would consider separating them, though: as the project scales it makes things much easier to manage. You can release your frontend on something like Netlify and then use something like AWS or Azure to host your Node/Express server.
Having separate sub-projects for the frontend and backend makes perfect sense due to vastly different dependencies. Mixing them up increases disk space consumption in production deployments and goes against security guidelines. Your folder structure is reasonable (except for the public/ subfolders, which I'm unsure about; maybe I'm missing something).
The approach import { myFunction } from '#shared/myFile'; is fine. Just don't use CRA.
For a project with a single frontend and a single backend there is no need for a shared/ top-level folder, because the frontend is the only source of 'UI truth' (e.g. the source of types and components related to UI), consumed only by the frontend, while the backend is the only source of 'API truth', consumed by both frontend and backend. With this arrangement only #backend/api_shared_stuff needs to be shared.
As for links to trusted resources where project structures for full-stack React development are explained: on the one hand, project authors usually have plenty of other things to explain, and on the other hand they must keep the documentation (typically a README) reasonably concise. You may find that justifying why the subdirectory structure is this and not that was simply not their top priority.
I'm currently working on a set of code to display all blobs inside a specified Azure container using a web front end. I'm expecting the final output to be something like this:
I started by creating a dummy storage account and populates it with some dummy files for me to play around with.
https://alicebob.blob.core.windows.net/documents
├── docx
│   ├── 201801_Discussion.docx
│   └── 201802_Discussion.docx
├── xlsx
│   ├── 201801_Summary.xlsx
│   ├── 201802_Summary.xlsx
│   └── 201803_Summary.xlsx
├── 201801_Review.pdf
├── 201802_Review.pdf
└── 201803_Review.pdf
To develop the file-listing function, I'm using the Azure Storage JavaScript client library from here. I put all the necessary code (.html and .js files) in the Azure static website $web container, and set index.html as both the Index document name and the Error document path in the Static website configuration.
https://alicebob.z23.web.core.windows.net/
├── azure-storage.blob.min.js
├── azure-storage.common.min.js
└── index.html
The problem is that the only listing functions are listBlobsSegmentedWithPrefix and listBlobDirectoriesSegmentedWithPrefix. So, in my case, I assume it wouldn't straightforwardly list all the blobs and directories in a well-structured tree format.
My current approach is to keep calling listBlobDirectoriesSegmentedWithPrefix until there are no more directories to list inside, then continue listing with listBlobsSegmentedWithPrefix.
So far I'm quite satisfied that my code can list all the blobs at the leaf level and also list all the directories that aren't at the leaf level. You can take a look at the blob listing here; feel free to 'View Source' to see the code I've built so far.
The only problem I face is that this code fails to list blobs that aren't at the leaf level. For example, it fails to list these blobs in the alicebob storage account:
├── 201801_Review.pdf
├── 201802_Review.pdf
├── 201803_Review.pdf
This is expected, since I'm not running listBlobsSegmentedWithPrefix when I'm not at the leaf level. The reason is that it would produce output like the following, which isn't what I want:
├── docx/201801_Discussion.docx
├── docx/201802_Discussion.docx
├── xlsx/201801_Summary.xlsx
├── xlsx/201802_Summary.xlsx
├── xlsx/201803_Summary.xlsx
├── 201801_Review.pdf
├── 201802_Review.pdf
├── 201803_Review.pdf
Any suggestion on how to overcome this issue? The real implementation would involve a huge amount of data, so I think a simple if-then-else wouldn't be efficient in this case.
Sorry for the long description; I just wanted to describe my problem as clearly as possible :)
There's an option called delimiter when listing blobs. Let's get down to code.
blobService.listBlobsSegmentedWithPrefix('documents', null, null, { delimiter: '/' }, (error, result, response) => {
  console.log(result);
  console.log(response.body.EnumerationResults.Blobs.BlobPrefix);
});
With the delimiter /, the listing operation returns results in two parts:
1. result contains info about the blobs directly under the root of the container — e.g. 201801_Review.pdf, etc. in your case.
2. BlobPrefix in the response body contains the directory names one level deep, each ending with the delimiter:
[ { Name: 'docx/' }, { Name: 'xlsx/' } ]
Using a BlobPrefix entry as the prefix, we can continue listing the contents of that subdirectory.
blobService.listBlobsSegmentedWithPrefix('documents', 'docx/', null, { delimiter: '/' }, (error, result, response) => {
  console.log(result);
  console.log(response.body.EnumerationResults.Blobs.BlobPrefix);
});
Basically the result in point 1 is enough; you don't necessarily have to use BlobPrefix to refactor your code. See more info in the section "Using a Delimiter to Traverse the Blob Namespace" of List Blobs.
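If a single flat listing is acceptable for your data volume, another option is to fetch all the names once (no delimiter) and group them into a tree client-side. A minimal sketch, independent of the storage SDK, using the example names from the question:

```javascript
// Group flat blob names like 'docx/201801_Discussion.docx' into a nested tree.
function buildTree(blobNames) {
  const root = {};
  for (const name of blobNames) {
    const parts = name.split('/');
    let node = root;
    parts.forEach((part, i) => {
      if (i === parts.length - 1) {
        // Last segment: an actual blob.
        (node.files = node.files || []).push(part);
      } else {
        // Interior segment: a virtual directory; descend into it.
        node.dirs = node.dirs || {};
        node = node.dirs[part] = node.dirs[part] || {};
      }
    });
  }
  return root;
}

const tree = buildTree([
  'docx/201801_Discussion.docx',
  'xlsx/201801_Summary.xlsx',
  '201801_Review.pdf',
]);
console.log(JSON.stringify(tree, null, 2));
```

This trades one larger listing request for the per-directory round trips, which may suit the "huge amount of data" case better than recursing with prefixes.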
You can also do this without the overhead of the whole Storage API by using a fetch request, as follows.
fetch("https://cvworkshop.blob.core.windows.net/telaviv-bw/?restype=container&comp=list")
  .then(response => response.text())
  .then(str => new window.DOMParser().parseFromString(str, "text/xml"))
  .then(data => console.log(data));
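The REST response is XML, so you still need to extract the blob names from it. A minimal sketch over a trimmed, hypothetical response body (a regex stands in for a real XML parser here, which is fine for the simple <Name> elements this listing returns):

```javascript
// Extract <Name> values from a List Blobs response body.
const sampleXml = `<EnumerationResults><Blobs>
  <Blob><Name>docx/201801_Discussion.docx</Name></Blob>
  <Blob><Name>201801_Review.pdf</Name></Blob>
</Blobs></EnumerationResults>`;

const names = [...sampleXml.matchAll(/<Name>([^<]+)<\/Name>/g)].map(m => m[1]);
console.log(names);
```

The resulting flat names can then be grouped into whatever tree structure the front end needs.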
I am working on a simple website that has to search quite a few text files in different sub-folders. The rest of the page uses jQuery, so I would like to use it for this as well. The function I am looking at is .get() for downloading the files. So my main question is: can I test this on my local computer (Ubuntu Linux), or do I have to upload it to a server?
Also, if there's a better way to go about this, that would be nice to know. However, I'm more worried about getting it working.
Thanks,
Frankie
PS: Here's the JS/jQuery code for downloading the files into an array.
g_lists = new Array();

$(":checkbox").each(function(i) {
  if ($(this).attr("name") != "0") {
    var path = "../" + $(this).attr("name") + ".txt";
    $("#bot").append("<br />" + path); // debug
    $.get(path, function(data) {
      g_lists[i] = data;
      $("#bot").html(data);
    });
  } else {
    g_lists[i] = "";
  }
});
Edit: Just a note about the path variable. I think it's correct, but I'm not 100% sure; I'm new to web development. Here are some examples it produces and the directory tree of the site. Maybe it will help — can't hurt.
.
├── include
│   ├── jquery.js
│   └── load.js
├── index.xhtml
├── style.css
└── txt
    └── Scripting_Tools
        ├── Editors.txt
        └── Other.txt
Examples of path:
../txt/Scripting_Tools/Editors.txt
../txt/Scripting_Tools/Other.txt
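One caveat about the snippet above, wherever it runs: $.get is asynchronous, so g_lists may still have holes when later code reads it, and each callback overwrites #bot. A sketch of an order-preserving alternative using plain Promises (getText is a stand-in for any fetcher, e.g. p => fetch(p).then(r => r.text()) in a modern browser):

```javascript
// Load every path and resolve once all are done, keeping results in order.
function loadAll(paths, getText) {
  return Promise.all(paths.map(p => getText(p)));
}

// Demo with a fake fetcher; swap in a real one when served over HTTP.
const fakeFetch = p => Promise.resolve('contents of ' + p);
loadAll(['../txt/Scripting_Tools/Editors.txt'], fakeFetch)
  .then(texts => console.log(texts[0]));
```

texts[i] always corresponds to paths[i], so there is no race on the shared array.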
Well, I'm a new user, so I can't "answer" my own question; I'll just post it here:
After asking for help on an IRC channel dedicated to jQuery, I was told I could use this on a local host. To do this I installed the Apache web server and copied my site into its directory. More information on setting it up can be found here: http://www.howtoforge.com/ubuntu_debian_lamp_server
Then to run the site I navigated my browser to "localhost" and everything worked.
You could create a Titanium Desktop app and wrap your JS up in there. Titanium seems to eliminate the cross-domain scripting issues when running JS from inside a packaged app.