I am using the auto SDK URLs to initialize Firebase:
<script src="/__/firebase/6.1.1/firebase-app.js"></script>
<script src="/__/firebase/6.1.1/firebase-auth.js"></script>
<script src="/__/firebase/6.1.1/firebase-storage.js"></script>
<script src="/__/firebase/6.1.1/firebase-messaging.js"></script>
<script src="/__/firebase/6.1.1/firebase-firestore.js"></script>
<script src="/__/firebase/init.js"></script>
I need to set the authDomain to my business domain, which is already set up on Firebase, so that the Google (Gmail) authentication prompt shows my domain name and not my Firebase project name.
Do I have to stop using these URLs and instead construct a config and initialize Firebase manually? Or is there a way to update authDomain separately?
I tried the following to see if I could get the config settings, intending to then modify the authDomain, but I received an error instead.
<script>
fetch('/__/firebase/init.json').then(async response => {
firebase.initializeApp(await response.json());
});
</script>
Here is the error from the developer log:
(index):195 Uncaught (in promise) TypeError: Failed to execute 'json' on 'Response': body stream is locked
at (index):195
(anonymous) @ (index):195
errors.ts:137 Overwriting FirebaseError base field "name" can cause unexpected behavior.
[UPDATE]
It seems that fetch is asynchronous and that all the scripts load asynchronously as well. Bottom line: it's darn near impossible to get the config information programmatically, change authDomain, and then call initializeApp before loading my framework scripts and initializing my main app, which relies on Firebase already being there.
Here is the async version I tried:
await fetch('/__/firebase/init.json').then(async (response)=> {
let config = await response.json();
config.authDomain = 'mydomain.com';
await firebase.initializeApp(config);
});
You should be able to use this; you just need to wait for the fetch() promise to resolve before attempting to do anything with Firebase.
For example
const firebaseReady = fetch('/__/firebase/init.json')
.then(res => res.ok ? res.json() : Promise.reject(res))
.then(config => {
// do whatever you need to do with config
return firebase.initializeApp(config)
})
Now you just need to chain on to the firebaseReady promise. For example
firebaseReady.then(app => {
const db = app.firestore()
const auth = app.auth()
// do other things here
})
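If the goal is specifically to override authDomain, the config can be tweaked in that same then() before it is handed to initializeApp(). A minimal sketch, assuming 'mydomain.com' stands in for your custom domain and that every other value from init.json is used unchanged:

const firebaseReady = fetch('/__/firebase/init.json')
  .then(res => res.ok ? res.json() : Promise.reject(res))
  .then(config => {
    // Override only the auth domain; keep the rest of the hosted config.
    return firebase.initializeApp({ ...config, authDomain: 'mydomain.com' })
  })

Everything that depends on Firebase then chains off firebaseReady exactly as shown above.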
So I have a peculiar situation here. I have built a widget with Vue 3 and bundled it as an IIFE in order to isolate the code from the client's. End users will be able to download a script from a CDN and change some parameters at will via HTML data attributes on the script tag, something like this:
<!-- Widget -->
<script src="https://cdn.xyz.xyz/widget.min.js"
data-icon="path_to_icon.jpg"
data-title="Brand support"
data-jwt-url="jwt token url"
data-rubiko-url="another url"
data-company-websocket-url="websocket url"
data-theme="default"
data-brand-id="brand-id">
</script>
<!-- end of Widget -->
The widget is working fine and all the HTML attributes coming from the script can be customized. There's only one requirement that is kind of confusing to me. As you can see, one of the data attributes (data-jwt-url) is an endpoint. The widget will fetch a token from that URL and then use it to retrieve some data from an API. But I also need to expose a global function that the end user can call from anywhere in their code to retrieve a token on demand.
In Vue this function is like this:
const retrieveSignedJWT = async () => {
if (jwtUrl.value) {
fetch(jwtUrl.value)
.then(async (response) => {
const data = await response.text();
// check for error response
if (!response.ok) {
// get an error message from body or default to response statusText
const error = (data && data.message) || response.statusText;
return Promise.reject(error);
}
let jwt = data;
token.value = data;
decodeToken(jwt);
retrieveCategories();
})
.catch((err) => {
error.value = err;
console.error("There was an error!", err);
});
} else {
console.log("The JWT URL coming from the script is corrupted");
}
};
Then I call that function from the onMounted Hook:
onMounted(async () => {
await retrieveSignedJWT();
});
By doing that, as soon as the widget has loaded into the client site it will retrieve a token. The question is: how can I expose the retrieveSignedJWT function via Vue so the client can call it from anywhere in their code, keeping in mind that the production code is an IIFE? Something like:
widget.retrieveSignedJWT();
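One way to do this (a sketch, not the only option) is to attach the function to a global namespace from inside the component's setup, once retrieveSignedJWT has been defined. The window.widget name below is only an assumption, chosen to match the desired call above:

// Inside setup(), after retrieveSignedJWT is defined.
// window.widget is an assumed namespace; pick whatever fits your bundle.
window.widget = window.widget || {};
window.widget.retrieveSignedJWT = retrieveSignedJWT;

The assignment to window makes the function reachable from the client's own scripts, so widget.retrieveSignedJWT() can be called anywhere after the widget has mounted.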
I'm having a hard time understanding the whole token part of Firebase uploads.
I simply want to upload user avatars, save them to the database, and then read them on the client side.
const storageRef = firebase.storage().ref();
storageRef.child(`images/user-avatars/${user.uid}`).put(imageObj);
Then, in my cloud function, I grab the new url like this:
exports.writeFileToDatabase = functions.storage.object().onFinalize(object => {
const bucket = defaultStorage.bucket();
const path = object.name as string;
const file = bucket.file(path);
return file
.getSignedUrl({
action: "read",
expires: "03-17-2100"
})
.then(results => {
const url = results[0];
const slicedPath = path.split("/");
return db
.collection("venues")
.doc(slicedPath[1])
.set({ images: FieldValue.arrayUnion(url) }, { merge: true });
});
});
I've enabled IAM in the Google APIs platform, and have added Cloud functions service agent to the App Engine default service account.
I feel like the exact same configuration has worked before, but now it sometimes doesn't even write the new URL, or I get a 403 when trying to read it. I can't find any explanation or errors for what I'm doing wrong.
EDIT:
Forgot to add this piece of code, but FieldValue is set at the top of the document as
const FieldValue = admin.firestore.FieldValue;
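The other identifiers the function relies on (db, defaultStorage, admin) aren't shown either; presumably the top of the file looks roughly like this. This is an assumption, not the code from the question:

import * as functions from "firebase-functions";
import * as admin from "firebase-admin";

admin.initializeApp();

const db = admin.firestore();            // Firestore client used for the write
const defaultStorage = admin.storage();  // default Cloud Storage service
const FieldValue = admin.firestore.FieldValue;  // as noted above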
EDIT:
This is the exact error I get: Failed to load resource: the server responded with a status of 403 ()
I got it when I tried to use the following link, which was generated automatically by the function above, as the source for an image component:
https://storage.googleapis.com/frothin-weirdos.appspot.com/images/user_avatars/yElCIVY4bAY5g5LnoOBhqN6mDNv2?GoogleAccessId=frothin-weirdos%40appspot.gserviceaccount.com&Expires=1742169600&Signature=qSqPuuY4c5xmdnpvfZh39Pw3Vyu2B%2FbGMD1rQwHDBUZTAnKwP11MaOFQt%2BTV53krkIgvJgQT0Xl3UUxkngmW9785fUri75SSPoBk0z4DKyZnEBLxgTGRE8MzmXadQ%2BHDJ3rSI8IkkoomdnANpLsPN9oySshZ1h4BfOBvAmK0hQ4Gge1glH7qhxFjVWfX3tovZoL8e2smhuCRXxDsZtJh0ihbIeZUEnX8lGic%2B9IT6y4OskS2ZlrZNjvM10hcEesoPdHsT4oCvfhCNbUcJcueRKfsWlDCd9m6qmf42WVOc7UI0nE0oEvysMutWY971GVRKTLwIXRnTLSNOr6fSvJE3Q%3D%3D
Following this: https://medium.com/@nedavniat/how-to-perform-and-schedule-firestore-backups-with-google-cloud-platform-and-nodejs-be44bbcd64ae
Code is:
const functions = require('firebase-functions'); // is installed automatically when you init the project
const { auth } = require('google-auth-library'); // is used to authenticate your request
async function exportDB () {
const admin = await auth.getClient({
scopes: [ // scopes required to make a request
'https://www.googleapis.com/auth/datastore',
'https://www.googleapis.com/auth/cloud-platform'
]
});
const projectId = await auth.getProjectId();
const url = `https://firestore.googleapis.com/v1beta1/projects/${projectId}/databases/(default):exportDocuments`;
return admin.request({
url,
method: 'post',
data: {
outputUriPrefix: 'gs://name-of-the-bucket-you-created-for-backups'
}
});
}
const backup = functions.pubsub.topic('YOUR_TOPIC_NAME_HERE').onPublish(exportDB);
module.exports = { backup };
When I go to deploy via:
gcloud functions deploy backup --runtime nodejs8 --trigger-topic YOUR_TOPIC_NAME_HERE
I get error:
ERROR: (gcloud.functions.deploy) OperationError: code=3,
message=Function failed on loading user code. Error message: Code in
file index.js can't be loaded. Is there a syntax error in your code?
Detailed stack trace: TypeError: Cannot read property 'user' of
undefined
Is this something with google-auth-library?
I assume that you are trying to deploy a GCF function triggered by an HTTP request. I suggest you check this link [1]; it seems to be the same use case and can help you use Google Cloud Datastore with Node.js on GCF.
[1] How to return an entire Datastore table by name using Node.js on a Google Cloud Function
I am trying to unit test an API call to ensure that it has been called with the correct properties. This API call depends on Stripe's external library, which is attached to the window via index.html (src="http://stripe[...]"). I get window.[...] is not a function.
I successfully mocked the $http.post request, but in the success callback from Stripe's payment, it redirects the user back by calling window.Stripe().redirectToCheckout(). I managed to mock window.Stripe but had difficulty with .redirectToCheckout() and was unsure of the correct way to go about it.
index.html:
<script src="https://js.stripe.com/v3/"></script>
<link rel="preconnect" href="https://q.stripe.com">
<link rel="preconnect" href="https://checkout.stripe.com">
StripePayment.vue
async stripe () {
await this.$http.post(process.env.VUE_APP_PAYMENTAPI + 'api/session/', {
amount: this.cost,
}).then(response => {
// Redirect to the main page by using the sessionId provided by stripes response.
window.Stripe(process.env.VUE_APP_STRIPE_KEY).redirectToCheckout({
sessionId: response.body
})
}, response => {
this.paymentFailed(response)
})
}
StripePayment.spec.js
let stripeSpy = sinon.spy(StripePayment.methods, 'stripe')
sinon.assert.calledOnce(stripeSpy)
I expect to be able to check that the API call has been called successfully. Unfortunately, I get the following error message: "UnhandledPromiseRejectionWarning: TypeError: window.Stripe is not a function". If I stub window.Stripe, then I get a similar error with .redirectToCheckout(), and it is at this point that I struggled with the stub.
There is some code similar to mine posted here: https://repl.it/@AndrewReinlieb/Checkout-Test.
For proper isolated unit testing, all units that don't belong to the tested unit should be mocked. When a unit belongs to an external library, it should be mocked on window:
const stripeMock = sinon.stub(window, 'Stripe');
const redirectToCheckoutMock = sinon.stub();
stripeMock.returns({ redirectToCheckout: redirectToCheckoutMock });
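With those stubs in place, the test can assert against the inner stub after the method under test runs. A rough sketch, assuming the component is mounted with vue-test-utils as wrapper and the mocked $http.post resolves with a session id:

// Trigger the method under test; awaiting lets its internal promise chain settle.
await wrapper.vm.stripe();

// The redirect should have been called once, with the session id from the mocked response.
sinon.assert.calledOnce(redirectToCheckoutMock);
sinon.assert.calledWithMatch(redirectToCheckoutMock, { sessionId: sinon.match.defined });

// Restore the global so other tests see the real window.Stripe.
stripeMock.restore();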
I'm using some service worker code from the Progressive Web app tutorial by Google but I am getting an error:
Uncaught (in promise) TypeError:
Failed to execute 'clone' on 'Response':
Response body is already used
The site uses third-party Javascript and stylesheets for web fonts.
I want to add assets hosted on these CDNs to the offline cache.
addEventListener("fetch", function(e) {
e.respondWith(
caches.match(e.request).then(function(response) {
return response || fetch(e.request).then(function(response) {
var hosts = [
"https://fonts.googleapis.com",
"https://maxcdn.bootstrapcdn.com",
"https://cdnjs.cloudflare.com"
];
hosts.map(function(host) {
if (e.request.url.indexOf(host) === 0) {
caches.open(CACHE_NAME).then(function(cache) {
cache.put(e.request, response.clone());
});
}
});
return response;
});
})
);
});
These are hosted on popular CDNs, so my hunch is they should be doing the right thing for CORS headers.
Here are the assets in the HTML that I want to cache:
<link rel="stylesheet" type="text/css"
href="https://fonts.googleapis.com/css?family=Merriweather:900,900italic,300,300italic">
<link rel="stylesheet" type="text/css"
href="https://fonts.googleapis.com/css?family=Lato:900,300" rel="stylesheet">
<link rel="stylesheet" type="text/css"
href="https://maxcdn.bootstrapcdn.com/font-awesome/latest/css/font-awesome.min.css">
<script type="text/javascript" async
src="https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.1/MathJax.js?config=TeX-AMS-MML_HTMLorMML">
</script>
According to the console logs, the service worker is trying to fetch these assets:
Fetch finished loading:
GET "https://maxcdn.bootstrapcdn.com/font-awesome/latest/css/font-awesome.min.css".
sw.js:32
Fetch finished loading:
GET "https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.1/MathJax.js?config=TeX-AMS-MML_HTMLorMML".
sw.js:32
Fetch finished loading:
GET "https://fonts.googleapis.com/css?family=Merriweather:900,900italic,300,300italic".
sw.js:32
Fetch finished loading:
GET "https://fonts.googleapis.com/css?family=Lato:900,300".
sw.js:32
Fetch finished loading:
GET "https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.1/config/TeX-AMS-MML_HTMLorMML.js?V=2.7.1".
sw.js:32
Fetch finished loading:
GET "https://maxcdn.bootstrapcdn.com/font-awesome/latest/fonts/fontawesome-webfont.woff2?v=4.7.0".
sw.js:32
If I remove the clone, as was suggested in Why does fetch request have to be cloned in service worker?, I'll get the same error:
TypeError: Response body is already used
If I add { mode: "no-cors" } to the fetch per Service worker CORS issue, I'll get the same error and these warnings:
The FetchEvent for
"https://maxcdn.bootstrapcdn.com/font-awesome/latest/fonts/fontawesome-webfont.woff2?v=4.7.0"
resulted in a network error response: an "opaque" response was
used for a request whose type is not no-cors
The FetchEvent for
"https://fonts.gstatic.com/s/lato/v14/S6u9w4BMUTPHh50XSwiPGQ3q5d0.woff2"
resulted in a network error response: an "opaque" response was
used for a request whose type is not no-cors
The FetchEvent for
"https://fonts.gstatic.com/s/lato/v14/S6u9w4BMUTPHh7USSwiPGQ3q5d0.woff2"
resulted in a network error response: an "opaque" response was
used for a request whose type is not no-cors
The FetchEvent for
"https://fonts.gstatic.com/s/merriweather/v19/u-4n0qyriQwlOrhSvowK_l521wRZWMf6hPvhPQ.woff2"
resulted in a network error response: an "opaque" response was
used for a request whose type is not no-cors
I could add these assets to the static cache in the service worker's install event, but I have reasons to add them to the cache only in the fetch event.
You're on the right track with using clone(), but the timing is important. You need to make sure that you call clone() before the final return response executes, because at that point, the response will be passed to the service worker's client page, and its body will be "consumed".
There are two ways of fixing this: either call clone() prior to executing the asynchronous caching code, or alternatively, delay your return response statement until after the caching has completed.
I'm going to suggest the first approach, since it means you'll end up getting the response to the page as soon as possible. I'm also going to suggest that you rewrite your code to use async/await, as it's much more readable (and supported by any browser that also supports service workers today).
addEventListener("fetch", function(e) {
e.respondWith((async function() {
const cachedResponse = await caches.match(e.request);
if (cachedResponse) {
return cachedResponse;
}
const networkResponse = await fetch(e.request);
const hosts = [
'https://fonts.googleapis.com',
'https://maxcdn.bootstrapcdn.com',
'https://cdnjs.cloudflare.com',
];
if (hosts.some((host) => e.request.url.startsWith(host))) {
// This clone() happens before `return networkResponse`
const clonedResponse = networkResponse.clone();
e.waitUntil((async function() {
const cache = await caches.open(CACHE_NAME);
// This will be called after `return networkResponse`
// so make sure you already have the clone!
await cache.put(e.request, clonedResponse);
})());
}
return networkResponse;
})());
});
Note: The (async function() {})() syntax might look a little weird, but it's a shortcut to use async/await inside an immediately executing function that will return a promise. See http://2ality.com/2016/10/async-function-tips.html#immediately-invoked-async-function-expressions
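For completeness, the second approach mentioned above (delaying the return until the caching has completed) would replace the waitUntil() block with something like this sketch, still inside the same async function:

if (hosts.some((host) => e.request.url.startsWith(host))) {
  const cache = await caches.open(CACHE_NAME);
  // Clone before cache.put() consumes the body, and await the put so the
  // response is handed back to the page only once caching has finished.
  await cache.put(e.request, networkResponse.clone());
}
return networkResponse;

The trade-off is latency: the page doesn't receive the response until the cache write finishes, which is why the first approach is usually preferable.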
For the original code, you need to clone the response before you do the asynchronous cache update:
var clonedResponse = response.clone();
caches.open(CACHE_NAME).then(function(cache) {
cache.put(e.request, clonedResponse);
});
The Service Worker primer by Google has example code showing the correct way. The code has a comment with an "important" note, but it's just emphasizing the clone, and not the issue you're having about when you clone:
// IMPORTANT: Clone the response. A response is a stream
// and because we want the browser to consume the response
// as well as the cache consuming the response, we need
// to clone it so we have two streams.
var responseToCache = response.clone();
caches.open(CACHE_NAME)
.then(function(cache) {
cache.put(event.request, responseToCache);
});
return response;