I have an application on Google App Engine written in Python, and I am using it as a server for my web application written in PHP. The PHP app acts as the client and calls my server-side API on GAE through a JavaScript endpoint.
I want to upload a file from my web application to my GAE application using Google Cloud Endpoints. I read about the Blobstore, but it didn't help much.
Please suggest what I should do.
You have to use Cloud Storage (GCS) to store your files instead of the Blobstore.
Use GCS because:
Google is moving away from the Blobstore.
GCS offers more functionality: ACLs, folders, and more.
You can use filenames and a Blobstore-like serving URL for images.
You can create serving URLs for non-images.
GCS is cheap and comes with a free default bucket.
To use GCS, you have to use the client library: https://cloud.google.com/appengine/docs/python/googlecloudstorageclient/
Here is an example in Java using this API:
// Requires the GCS client library (com.google.appengine.tools.cloudstorage.*)
GcsService gcsService = GcsServiceFactory.createGcsService();
GcsFilename filename = new GcsFilename(CONSTANTES.BUCKETNAME, unique + ".jpg");
GcsFileOptions options = new GcsFileOptions.Builder()
        .mimeType("image/jpeg")
        .acl("public-read")
        .build();
GcsOutputChannel writeChannel = gcsService.createOrReplace(filename, options);
// Write the bytes and close the channel to finish the upload
writeChannel.write(ByteBuffer.wrap(imageBytes));
writeChannel.close();
EDIT: This is written (my mistake) as if you were using Java. Feel free to apply the analogous pattern for PHP on App Engine.
What the other user wrote about GCS answers the storage portion of your question (you should definitely use GCS), but as for your idea of POSTing the form data to an Endpoints function, this is definitely not advised. API calls should be little, tiny pieces of data. An API should be like a paper airplane, lightly flying to your server to request some real data.
The way to have users upload files to GCS is to serve them a page with a file upload form (enctype="multipart/form-data") whose action attribute is an upload URL generated by the Blobstore service's createUploadUrl() function and templated into the page in your servlet's doGet(HttpServletRequest req, HttpServletResponse resp) method. You can use it like this:
String uploadUrl = blobstoreService
.createUploadUrl("/upload",
UploadOptions.Builder.withGoogleStorageBucketName("my_bucket"));
In this manner you obtain an upload URL for GCS to which the user's file POST will go. The route /upload is a route in your app to which the request is handed once the upload has been received. Any extra form parameters you add will still be visible to the request handler (a doPost() method on a servlet) that you define for that route.
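If the page driving the upload is JavaScript-based rather than a plain HTML form, the same POST can be made with FormData. A minimal sketch, assuming the generated upload URL was templated into the page as window.uploadUrl (a name invented here for illustration):

// Sketch: POST a file to the upload URL generated by createUploadUrl().
// window.uploadUrl is a placeholder for however you template the URL in.
var fileInput = document.querySelector("input[type=file]");
var formData = new FormData();
formData.append("file", fileInput.files[0]);
fetch(window.uploadUrl, { method: "POST", body: formData })
    .then(function (resp) {
        // App Engine hands the request to your /upload route once the file is stored
        console.log("Final response came from:", resp.url);
    });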
With all of this info, you're ready to begin serving file upload forms to your users without worrying about how this interacts with Cloud Endpoints. Use Endpoints to define the API calls needed by your JavaScript/Android/iOS client, not for handling file uploads.
I am new to GAE, so I am struggling to understand a few things.
I am trying to build a Python web app that processes videos uploaded by users (through the web app) and displays some visualizations (built using d3.js) once the processing is done. The artifacts created during processing are saved locally and later uploaded to user-specific GCS buckets (they are not publicly accessible).
I want to be able to display the visualization (using the processed video artifacts) when a user requests it. As per my understanding, since these are dynamically generated, I cannot store the artifacts in the static folder for JavaScript to access. So it seems that I have to save the processed video artifacts in a /tmp folder.
How do I ensure that JavaScript is able to fetch files from this external /tmp folder?
Or is there a better way to do this using GCS itself? How do I access buckets from JavaScript without making them public?
Please suggest some resources or ideas to solve this. Thanks!
I think you've got it backwards.
You have a private bucket; that's great for security. In order to have the client JavaScript (browser, mobile app) download an object, you need to either:
Have an HTTP handler in your Python GAE app that retrieves the file from GCS and sends it to the client (Flask pseudo-code):
from google.cloud import storage

@app.route('/private/<name>')
def hello_name(name):
    # if the user is not authorized:
    #     return "Unauthorized", 401
    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(name)
    bts = blob.download_as_bytes()
    # Flask serves the raw bytes back to the client
    return bts
Give the client a signed URL from GCS so it can download the file directly:
import datetime
from google.cloud import storage

@app.route('/private/<name>')
def hello_name(name):
    # if the user is not authorized:
    #     return "Unauthorized", 401
    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(name)
    url = blob.generate_signed_url(
        version="v4",
        # This URL is valid for 15 minutes
        expiration=datetime.timedelta(minutes=15),
        # Allow GET requests using this URL.
        method="GET",
    )
    return url
As a note, in the second case the JavaScript client will first need to access /private/your_file_name to retrieve the signed URL, and then download the actual file from GCS using that signed URL.
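For the JavaScript side, a minimal sketch of that two-step flow might look like this (it assumes the /private/<name> route returns the signed URL as plain text, matching the handler above):

// Sketch of the two-step download: get the signed URL, then fetch the object
function downloadPrivateFile(name) {
    return fetch("/private/" + encodeURIComponent(name))
        .then(function (resp) { return resp.text(); })        // step 1: signed URL
        .then(function (signedUrl) { return fetch(signedUrl); })
        .then(function (resp) { return resp.blob(); });       // step 2: actual bytes
}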
I'm trying to test whether PHP works on my Firebase Hosting site using the following:
(index.html)
<form action="welcome.php" method="post">
<input type="submit">
</form>
(welcome.php)
<?php
$to = "my#email.com";
$subject = "My subject";
$txt = "Hello world!";
$headers = "From: dummy#email.com";
mail($to,$subject,$txt,$headers);
?>
Every time I try this, the browser keeps attempting to open the PHP file rather than processing it. Is simple PHP enabled on Firebase Hosting to process a basic form like this? If I can get it to work this way, I will build the form out properly, including validation, etc.
Thanks,
From the Firebase Hosting site (emphasis mine):
We deliver all of your *static* content (html, js, images, etc.) over a secure SSL connection and serve it on a CDN.
Firebase Hosting is for hosting static assets. Firebase currently doesn't offer any way to execute your code on Firebase's servers.
Update (2018-08-08): You can now run Node.js/JavaScript code by connecting your Firebase Hosting project to Cloud Functions. But that still won't allow you to run PHP code.
As per the latest update, Firebase now supports Cloud Functions:
Cloud Functions for Firebase lets you run mobile backend code that automatically responds to events triggered by Firebase features and HTTPS requests. Your code is stored in Google’s cloud and runs in a managed environment. There's no need to manage and scale your own servers.
For more: https://firebase.google.com/docs/functions/
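As a rough illustration, the PHP handler from the question could become an HTTPS function along these lines (a hedged sketch; the function name and form field names are invented, and the actual mail sending is left to a provider of your choice):

const functions = require('firebase-functions');

// Sketch of an HTTPS function standing in for welcome.php;
// the field names "email" and "message" are assumptions.
exports.welcome = functions.https.onRequest((req, res) => {
    const email = req.body.email;
    const message = req.body.message;
    // ...hand the values to a mail-sending service here...
    res.status(200).send('Form received');
});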
There is no PHP, but Node.js is available for server-side scripting:
Google Cloud Functions are written in JavaScript, and execute in a Node.js runtime.
Mandrill also supports Node.js, and it features a webhooks API.
Therefore, one can require that Node module within these cloud functions and webhooks, and then POST to them from an HTML form.
There would need to be a few HTTPS cloud functions defined for the Firebase project, in order to let users subscribe, unsubscribe, and manage their subscriptions. One could even generate the HTML markup for the input form with a cloud function and attach it. As an example, not tested and with no guarantee included:
const functions = require('firebase-functions');
const mandrill = require('mandrill-api/mandrill');
var client = new mandrill.Mandrill('YOUR_API_KEY');
/* TODO: add the user on Firebase, respond through the API */
exports.user_add = functions.https.onRequest((req, res) => {
});
/* TODO: change subscription settings on Firebase, respond through the API */
exports.user_edit = functions.https.onRequest((req, res) => {
});
/* TODO: remove the user on Firebase, respond through the API */
exports.user_remove = functions.https.onRequest((req, res) => {
});
/* optional: generate the HTML markup of the form, send HTTP response */
exports.markup = functions.https.onRequest((req, res) => {
});
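To give one of those TODOs a concrete shape, user_add could forward the posted address to Mandrill roughly like this (untested, same caveat as above; the sender, subject, and field names are invented):

/* sketch: one possible body for user_add */
exports.user_add = functions.https.onRequest((req, res) => {
    var message = {
        from_email: 'noreply@example.com',   // invented sender address
        to: [{ email: req.body.email, type: 'to' }],
        subject: 'Please confirm your subscription',
        text: 'Click the link in this mail to confirm.'
    };
    client.messages.send({ message: message, async: false },
        function (result) { res.status(200).send('subscribed'); },
        function (err) { res.status(500).send('Mandrill error: ' + err.name); });
});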
One can bind to the events of Firebase Auth to keep two user databases in sync (this is not required for Mandrill, but it is for MailChimp, no matter whether one uses the PHP or Node.js wrapper):
exports.on_user_create = functions.auth.user().onCreate(event => {
const user = event.data;
});
exports.on_user_delete = functions.auth.user().onDelete(event => {
const user = event.data;
});
Firebase on Websites explains it, and there is a Local Emulator for Cloud Functions.
You can play around with any of these: Angular, Ember, Knockout, React, Node.js. Whatever your PHP code does, you can make happen with pretty much any JavaScript technology; there is just no server-side dynamic language. Another way is to use an online form provider like JotForm: you create and style the form within your account, then simply add it to your site, and user submissions post to the provider. As a result you have a centralized environment, not only for your current site but for any others down the road. You can also create a web service, post values there, and then do whatever you want with them, such as saving them to a database. In other words, have another server handle all those things so you can just call it from Firebase-hosted sites. Hope that helps.
PS: I am currently building a product that is a simplified version of online forms to be used on Firebase websites. I am planning to have a few people use it for now, so if you would like, you can email me and I will create an account for you. As long as there is no abuse, like sending a bunch of emails, you will be fine!
I am attempting to create a mobile phone application with a JavaScript/AngularJS frontend that communicates with a Node.js/Express backend.
I believe that I have properly enabled CORS, but I am not completely certain it has been done correctly. None of the frontend files are hosted on a server (not even a local one). The Node.js server is hosted online, as is the MongoDB server it interacts with.
So far I am able to make POSTs to my API that create a new user and reflect this in the database. I also have a login that POSTs to an authentication function, which returns a JSON Web Token (JWT). From here I should be able to put the JWT in the header of requests under the key "Authorization" to get access to the other parts of the API (e.g., GET /currentUser).
Attempting to GET /currentUser with the JWT in the header via Postman returns all of the expected data. When I attempt to perform the same GET from my frontend (with the JWT in the header), I get the following OPTIONS response via Firebug: "Reload the page to get source for: MyHostedApi/api/users"
I'm wondering if this is some kind of CORS issue, an incorrectly set Authorization header, bad formatting of the $http.get, etc. Any help is greatly appreciated! I'd be glad to provide any parts of the source that are relevant.
This is what my GET looks like:
$http.get("MyHostedApi/api/users/currentUser")
.success(function(response) {
$scope.userData = response.data.firstName;
});
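For comparison, here are hedged sketches of what each side might look like when CORS and the header are both set up; the cors middleware options and the jwtToken variable are assumptions, not taken from the question. On the Express side:

const express = require('express');
const cors = require('cors');
const app = express();

// Answer the OPTIONS preflight and allow the Authorization header;
// origin '*' is an assumption for illustration.
app.use(cors({
    origin: '*',
    allowedHeaders: ['Content-Type', 'Authorization']
}));

And on the AngularJS side, attaching the token explicitly per request (jwtToken is a placeholder for wherever the token is stored):

$http.get("MyHostedApi/api/users/currentUser", {
    headers: { "Authorization": jwtToken }
})
.success(function(response) {
    $scope.userData = response.data.firstName;
});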
Introduction
I have an SAP HANA Cloud Platform account. I have also deployed a Java application to the account and created a test servlet, which returns dummy data.
On the other side, I have an SAPUI5 application which I develop in the Web IDE. I also created two destinations:
Destination "virtualTEST" is connected to an SAP backend system (HANA Cloud Connector).
Destination "javaTEST" is connected to my Java servlet application.
The neo-app.json is configured correctly, and I can obtain data from the test servlet (the dummy data) and from the SAP backend system (OData gateway).
The problem
Now I want to pass variables to the SAP backend system (virtualTEST destination) that should not be visible in the frontend, to avoid JavaScript manipulation.
My first thought
My first thought was to create a new servlet which acts as a proxy. In the SAPUI5 app I call the servlet through the javaTEST destination and pass the "hidden variables", like /testServlet?targetUrl=https://webide-xxx.hana.ondemand.com/sap/opu/odata/TEST_SRV/TEST?$filter=Var eq '{{MYVAR}}', and the Java application replaces {{MYVAR}} with my real variable. Then the target is loaded (this is also a destination URL of my SAPUI5 application). This does not work; I do not know why, but I think the proxy cannot obtain data from the destination of another application.
Also, I think this is not the best solution. How can this be solved? Any ideas or best practices? Can destinations be used in a Java application? :)
It is not really clear to me what you want to achieve. Of course you can call destinations from Java.
First, declare the destination as a resource reference in your web.xml:
<resource-ref>
    <res-ref-name>myBackend</res-ref-name>
    <res-type>com.sap.core.connectivity.api.http.HttpDestination</res-type>
</resource-ref>
import javax.naming.Context;
import javax.naming.InitialContext;

import org.apache.http.HttpResponse;
import org.apache.http.client.HttpClient;
import org.apache.http.client.methods.HttpGet;

import com.sap.core.connectivity.api.http.HttpDestination;
...
// look up the destination "myBackend"
Context ctx = new InitialContext();
HttpDestination destination = (HttpDestination) ctx.lookup("java:comp/env/myBackend");

// call service "myService" on the system configured in the given destination
HttpClient createHttpClient = destination.createHttpClient();
HttpGet get = new HttpGet("myService");
HttpResponse resp = createHttpClient.execute(get);
This comes from the official documentation. In the HttpGet you could set parameters if you like.
In my opinion your backend should be secure enough that you don't have to worry about JavaScript manipulation, especially not for exposed OData services.
Regards
Mathias
I'm developing a single-page application with a PHP backend using the Slim Framework and a JavaScript frontend client using Backbone.js. I came across a situation where I want to log requests that result in a 404 error, but hash fragments are not included in the request that reaches the backend. I'm wondering if there is a workaround.
My first thought was to have JavaScript write a cookie with the hash fragment and have PHP read that for logging.
For logging I'm using a custom Monolog handler for Doctrine 2.
This is what I use for sending those fragments off to Google Analytics or to my error-reporting system:
var url = Backbone.history.getFragment();
// Ensure the fragment starts with a slash so it reads like a path
if (!/^\//.test(url)) {
    url = "/" + url;
}
// One way to hand the fragment to the PHP backend for logging (per the
// cookie idea above); sending it to an analytics tool works the same way
document.cookie = "fragment=" + encodeURIComponent(url) + "; path=/";