I've been using Workbox for a few days and I've set it up to generate the service worker from a source file rather than letting Workbox generate it for me.
It works fine, but now I'm trying to include the workbox-background-sync module to store some failed POST requests and I can't get it to work.
After running the service worker, I get "Uncaught TypeError: Cannot read property 'QueuePlugin' of undefined" on the first backgroundSync line (line number 9). This is the generated service worker file:
importScripts('workbox-sw.prod.v2.1.2.js');
importScripts('workbox-background-sync.prod.v2.0.3.js');

const workbox = new WorkboxSW({
  skipWaiting: true,
  clientsClaim: true
})

let bgQueue = new workbox.backgroundSync.QueuePlugin({
  callbacks: {
    replayDidSucceed: async (hash, res) => {
      self.registration.showNotification('Background sync demo', {
        body: 'Product has been purchased.',
        icon: '/images/shop-icon-384.png',
      });
    },
    replayDidFail: (hash) => {},
    requestWillEnqueue: (reqData) => {},
    requestWillDequeue: (reqData) => {},
  },
});

const requestWrapper = new workbox.runtimeCaching.RequestWrapper({
  plugins: [bgQueue],
});

const route = new workbox.routing.RegExpRoute({
  regExp: new RegExp('^https://jsonplaceholder.typicode.com'),
  handler: new workbox.runtimeCaching.NetworkOnly({requestWrapper}),
});

const router = new workbox.routing.Router();
router.registerRoute({route});

workbox.router.registerRoute(
  new RegExp('.*\/api/catalog/available'),
  workbox.strategies.networkFirst()
);

workbox.router.registerRoute(
  new RegExp('.*\/api/user'),
  workbox.strategies.networkFirst()
);

workbox.router.registerRoute(
  new RegExp('.*\/api/security-element'),
  workbox.strategies.networkOnly()
);

workbox.precache([]);
I've tried to load it with workbox.loadModule('workbox-background-sync'), a workaround I found on GitHub, but it's still not working. I've also tried self.workbox = new WorkboxSW(), with the same fate.
P.S.: Is there a way I can hook a function AFTER a strategy like networkFirst has failed and is about to respond from the cache? If I'm getting a cached response, I'd like to tell the user by modifying the incoming response and handling it later in Vue, for example. Thanks for reading!
I solved it. It was actually pretty silly: I was accessing my own const workbox instead of the workbox.backgroundSync namespace exposed by the imported script. I fixed it by simply renaming the const workbox to const workboxSW.
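For reference, here is a minimal sketch of the corrected file (same v2 scripts as above); only the renamed lines are shown, the callbacks and routes stay as they were:

importScripts('workbox-sw.prod.v2.1.2.js');
importScripts('workbox-background-sync.prod.v2.0.3.js');

// Renamed instance so it no longer shadows the global `workbox` namespace
// that the imported scripts attach to the service worker scope.
const workboxSW = new WorkboxSW({
  skipWaiting: true,
  clientsClaim: true
});

// `workbox.backgroundSync` now resolves to the imported background-sync module
// instead of a (non-existent) property on the WorkboxSW instance.
let bgQueue = new workbox.backgroundSync.QueuePlugin({
  callbacks: { /* same callbacks as above */ },
});

The calls that go through the instance (router.registerRoute, strategies.networkFirst(), precache([])) then have to be made on workboxSW instead of workbox.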
I'm currently working on Web3Auth authentication. In their documentation most of the code is in TypeScript, but I'm using JavaScript in my Next.js project. I'm using the core packages to build a custom UI interface, but I'm getting this wallet error when I try to log in using walletAdapters.
This is my code for that:
// Note: this snippet sits inside an async login handler (its opening is not shown);
// clientId, setProvider and subscribeAuthEvents are defined elsewhere in the component.
// Web3AuthCore comes from "@web3auth/core", OpenloginAdapter from "@web3auth/openlogin-adapter",
// and CHAIN_NAMESPACES / WALLET_ADAPTERS from "@web3auth/base".
const web3AuthCoreCtorParams = {
  clientId,
  chainConfig: {
    chainNamespace: CHAIN_NAMESPACES.EIP155,
    chainId: "0x1",
    rpcTarget:
      "https://rinkeby.infura.io/v3/9aa3d95b3bc440fa88ea12eaa4456161", // This is the testnet RPC we have added, please pass on your own endpoint while creating an app
  },
};

// creating new web3auth instance
const web3AuthInstance = new Web3AuthCore(web3AuthCoreCtorParams);
subscribeAuthEvents(web3AuthInstance);

const openloginAdapter = new OpenloginAdapter({
  adapterSettings: {
    clientId,
    network: "testnet",
    uxMode: "redirect",
  },
});
web3AuthInstance.configureAdapter(openloginAdapter);

const web3authProvider = await web3AuthInstance.connectTo(
  WALLET_ADAPTERS.OPENLOGIN,
  {
    relogin: true,
    loginProvider: "discord",
  }
);
setProvider(web3authProvider);
}; // closes the enclosing async login handler
And the error I'm getting is
Unhandled Runtime Error
WalletLoginError: Failed to connect with walletFailed to login with openlogin
Call Stack
Function.fromCode
node_modules/@web3auth/base/dist/base.esm.js (315:0)
Function.connectionError
node_modules/@web3auth/base/dist/base.esm.js (320:0)
OpenloginAdapter.connect
node_modules/@web3auth/openlogin-adapter/dist/openloginAdapter.esm.js (141:12)
async Web3AuthCore.connectTo
node_modules/@web3auth/core/dist/core.esm.js (105:0)
I'm using version 1.0.0
I want to upload a file with evaporate.js and crypto-js using x-amz-security-token:
import * as Evaporate from 'evaporate';
import * as crypto from "crypto-js";

Evaporate.create({
  aws_key: <ACCESS_KEY>,
  bucket: 'my-bucket',
  awsRegion: 'eu-west',
  computeContentMd5: true,
  cryptoMd5Method: data => crypto.algo.MD5.create().update(String.fromCharCode.apply(null, new Uint32Array(data))).finalize().toString(crypto.enc.Base64),
  cryptoHexEncodedHash256: (data) => crypto.algo.SHA256.create().update(data as string).finalize().toString(crypto.enc.Hex),
  logging: true,
  maxConcurrentParts: 5,
  customAuthMethod: (signParams: object, signHeaders: object, stringToSign: string, signatureDateTime: string, canonicalRequest: string): Promise<string> => {
    const stringToSignDecoded = decodeURIComponent(stringToSign)
    const requestScope = stringToSignDecoded.split("\n")[2];
    const [date, region, service, signatureType] = requestScope.split("/");

    const round1 = crypto.HmacSHA256(`AWS4${signParams['secret_key']}`, date);
    const round2 = crypto.HmacSHA256(round1, region);
    const round3 = crypto.HmacSHA256(round2, service);
    const round4 = crypto.HmacSHA256(round3, signatureType);
    const final = crypto.HmacSHA256(round4, stringToSignDecoded);

    return Promise.resolve(final.toString(crypto.enc.Hex));
  },
  signParams: { secretKey: <SECRET_KEY> },
  partSize: 1024 * 1024 * 6
}).then((evaporate) => {
    evaporate.add({
      name: 'my-key',
      file: file, // file from <input type="file" />
      xAmzHeadersCommon: { 'x-amz-security-token': <SECURITY_TOKEN> },
      xAmzHeadersAtInitiate: { 'x-amz-security-token': <SECURITY_TOKEN> },
    }).then(() => console.log('complete'));
  },
  (error) => console.error(error)
);
but it produces this output:
AWS Code: SignatureDoesNotMatch, Message:The request signature we calculated does not match the signature you provided. Check your key and signing method.status:403
What am I doing wrong?
SIDE NOTE
These are the versions I'm using on the browser side:
{
  "crypto-js": "^4.1.1",
  "evaporate": "^2.1.4"
}
You have your crypto.HmacSHA256 parameters reversed; they should all be the other way around (crypto-js expects the message first and the key second). I've been bashing my head against a wall trying to get Evaporate 2.x to work for the last week, and it's been very frustrating.
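As a rough sketch (not necessarily the exact code I ended up with), the customAuthMethod from the question would look like this with the arguments flipped; all variable names come from the snippet above:

customAuthMethod: (signParams: object, signHeaders: object, stringToSign: string, signatureDateTime: string, canonicalRequest: string): Promise<string> => {
  const stringToSignDecoded = decodeURIComponent(stringToSign);
  const requestScope = stringToSignDecoded.split("\n")[2];
  const [date, region, service, signatureType] = requestScope.split("/");

  // crypto-js HmacSHA256 takes (message, key): the value being signed goes first,
  // the key second. Also note the config above sets `secretKey` but reads
  // signParams['secret_key']; whichever property name you use has to match.
  const kDate = crypto.HmacSHA256(date, `AWS4${signParams['secret_key']}`);
  const kRegion = crypto.HmacSHA256(region, kDate);
  const kService = crypto.HmacSHA256(service, kRegion);
  const kSigning = crypto.HmacSHA256(signatureType, kService);
  const signature = crypto.HmacSHA256(stringToSignDecoded, kSigning);

  return Promise.resolve(signature.toString(crypto.enc.Hex));
},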
I tried your code above and looked over all the docs and forum posts related to this, and I think using Cognito for this auth either just doesn't work or at least it isn't obvious how it's supposed to work, even though the AWS docs suggest it's possible.
In the end I went with Lambda-based authentication and finally got it working, after wading through a lot of misinformation about how to use the various crypto libraries to sign this stuff. I got it working last night after rigorously examining every bit of what was going on. Reading the AWS docs also helped get me on the right path as to how the crypto needs to work; they give example inputs and outputs so you can verify that your crypto methods produce exactly what AWS expects:
https://docs.aws.amazon.com/general/latest/gr/sigv4_signing.html
Tasks 1, 2 and 3 are especially important to read and understand.
I have a React application that calls the Places API through Google's dedicated places library.
The library is imported like this:
<script defer src="https://maps.googleapis.com/maps/api/js?key=<API_KEY>&libraries=places&callback=initPlaces"></script>
The code above is inside /public, in index.html. The initPlaces callback specified in the URL looks like this:
function initPlaces() {
  console.log("Places initialized");
}
To make the actual request, the following code is used:
async function makeGapiRequest() {
  const service = new window.google.maps.places.AutocompleteService();
  const response = await service.getQueryPredictions({
    input: "Verona"
  });
  console.log(response);
}
For testing purposes, the function is called when the document is clicked:
document.addEventListener("click", () => {
  makeGapiRequest();
});
On every request, there is a response coming back. For instance, when the input has the value of Verona, the following response is received, though it is only visible in the browser's network tab:
{
  predictions: [
    {
      description: "Verona, VR, Italy",
      ...
    },
    ...
  ],
  status: "OK"
}
Whenever makeGapiRequest is called, even though there is a visible response from the API, the response variable is undefined at the time of logging, and the following error is thrown in the console:
places_impl.js:31 Uncaught TypeError: c is not a function
at places_impl.js:31:207
at Tha.e [as l] (places_impl.js:25:320)
at Object.c [as _sfiq7u] (common.js:97:253)
at VM964 AutocompletionService.GetQueryPredictionsJson:1:28
This error is thrown from within the Places library imported in /public/index.html.
Has anyone encountered this error before, or does anyone have an idea what the problem is? I would prefer a solution I can apply in my own code rather than a change to the library.
The problem was that I was calling the wrong method. Instead of getQueryPredictions, call the getPlacePredictions method. It returns slightly different results, but you can configure it to suit your needs.
Old code:
async function makeGapiRequest() {
  const service = new window.google.maps.places.AutocompleteService();
  const response = await service.getQueryPredictions({
    input: "Verona"
  });
  console.log(response);
}
New code:
async function makeGapiRequest() {
  const service = new window.google.maps.places.AutocompleteService();
  const response = await service.getPlacePredictions({
    input: "Verona",
    types: ["(cities)"]
  });
  console.log(response);
}
I am using the following minimal Probot app and am trying to write Mocha unit tests for it.
Unfortunately, it results in the error below, which indicates that some of my setup for the private key or security tokens is not being picked up.
I assume that the configuration in my .env file is correct, since I do not get the same error when I start the app via probot-run.js.
Are there any extra steps needed to configure Probot when it is used with Mocha?
Any suggestions on why the use of the scheduler extension may result in such an issue would be great.
Code and error below:
app.ts
import createScheduler from "probot-scheduler";
import { Application } from "probot";

export = (app: Application) => {
  createScheduler(app, {
    delay: !!process.env.DISABLE_DELAY, // delay is enabled on first run
    interval: 24 * 60 * 60 * 1000 // 1 day
  });

  app.on("schedule.repository", async function (context) {
    app.log.info("schedule.repository");
    const result = await context.github.pullRequests.list({owner: "owner", repo: "test"});
    app.log.info(result);
  });
};
test.ts
import createApp from "../src/app";
import nock from "nock";
import { Probot } from "probot";

nock.disableNetConnect();

describe("my scenario", function() {
  let probot: Probot;

  beforeEach(function() {
    probot = new Probot({});
    const app = probot.load(createApp);
  });

  it("basic feature", async function() {
    await probot.receive({name: "schedule.repository", payload: {action: "foo"}});
  });
});
This unfortunately results in the following error:
Error: secretOrPrivateKey must have a value
at Object.module.exports [as sign] (node_modules/jsonwebtoken/sign.js:101:20)
at Application.app (node_modules/probot/lib/github-app.js:15:39)
at Application.<anonymous> (node_modules/probot/lib/application.js:260:72)
at step (node_modules/probot/lib/application.js:40:23)
at Object.next (node_modules/probot/lib/application.js:21:53)
It turns out that new Probot({}), as suggested in the documentation, initializes the Probot object without any parameters (the given options object {} is empty, after all).
To avoid the error, one can provide the information manually:
new Probot({
  cert: "...",
  secret: "...",
  id: 12345
});
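Applied to the test above, the beforeEach could look roughly like this; the id, cert and secret values below are dummies for illustration, to be replaced with your test app's credentials:

// test.ts — sketch of the adjusted setup with manually provided options
beforeEach(function() {
  probot = new Probot({
    id: 12345,
    cert: "-----BEGIN RSA PRIVATE KEY-----\n...\n-----END RSA PRIVATE KEY-----",
    secret: "test-webhook-secret"
  });
  probot.load(createApp);
});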
I am using Ember CLI + Ember Data + Simple Auth. The authenticator is working fine, but when I make a REST call with the Ember Data REST adapter (this.store.findAll("user");), the authorize function in my custom authorizer doesn't get called.
The REST API endpoint is on another domain, so I added the URL to the crossOriginWhitelist in my environment.js.
environment.js:
module.exports = function(environment) {
  var ENV = {
    // some configuration
  };

  ENV['simple-auth'] = {
    crossOriginWhitelist: ['http://api.xxxx.com'],
    authorizer: 'authorizer:xxxx',
    routeAfterAuthentication: 'dashboard',
  };

  return ENV;
};
authorizer
import Ember from 'ember';
import Base from 'simple-auth/authorizers/base';

var XXXXAuthorizer = Base.extend({
  authorize: function(jqXHR, requestOptions) {
    // Some code; doesn't get called, damn it :(
  }
});

export default {
  name: 'authorization',
  before: 'simple-auth',
  initialize: function(container) {
    container.register('authorizer:xxxx', XXXXAuthorizer);
  }
};
index.html
....
<script>
  window.XXXXWebclientENV = {{ENV}};
  window.ENV = window.MyAppENV;
  window.EmberENV = window.XXXXWebclientENV.EmberENV;
</script>
<script>
  window.XXXXWebclient = require('xxxx-webclient/app')['default'].create(XXXXWebclientENV.APP);
</script>
Thanks for the help :)
I had a similar problem. For me it was the crossOriginWhitelist config.
I set it like this:
// config/environment.js
ENV['simple-auth'] = {
  crossOriginWhitelist: ['*'] // <-- Make sure it's an array, not a string
};
to see if I could get it working (I could), and then narrowed it down to figure out exactly what URL I should use to enforce the restriction (port number, hostname, etc.).
Don't leave it like that, though!
You should figure out exactly what URL works for the whitelist and use that.
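For example (a sketch only; the hostname and port are placeholders based on the question's config), the narrowed-down version might end up looking like this:

// config/environment.js — narrowed whitelist once the exact origin is known
ENV['simple-auth'] = {
  crossOriginWhitelist: ['http://api.xxxx.com:80'],
  authorizer: 'authorizer:xxxx',
  routeAfterAuthentication: 'dashboard'
};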
I am facing the same issue. I have the same setup, but the authorize function is not being called. Maybe you can try adding the port number to your crossOriginWhitelist URL.
I add the window.ENV = window.MyAppENV line in a new initializer that runs before simple-auth. You have added it in your index file, and maybe that is the reason why simple-auth is not able to read your configuration.
Does the other configuration, routeAfterAuthentication: 'dashboard', work properly? If not, then this might be the reason. Try adding a new initializer like this:
export default {
  name: 'simple-auth-config',
  before: 'simple-auth',
  initialize: function() {
    window.ENV = window.MyAppNameENV;
  }
};