In what order do Google Chrome Sync Storage 'get' commands run? - javascript

I am having a very difficult time getting the timing of a simple counting process down.
When I use console.log and alerts to debug, it seems that the code runs randomly - not in the order it appears in the function.
I'm creating a Google Extension and using sync storage to store settings and retrieve them to update values for the user.
For instance, I'm trying to stuff a string into "localBody" so I can then count the words in the string.
So first up in background.js is a listener that fires when a specific sync variable is changed...and it pulls a value from sync storage:
chrome.storage.onChanged.addListener(function (changes, sync) {
  if (changes["Clabel"]) {
    alert("Label Changed - start listener update...");
    // Get body, label, group from Sync Storage
    chrome.storage.sync.get('Cbody', function (results) {
      localBody = results.Cbody;
      console.log("first pulling body string from sync storage" + localBody);
    });
    // ... (rest of the listener continues below)
Later in the code localBody is sent off to a sub that counts words in the string:
wordCount = wordCount + countWords(localBody); //Update total word count with latest addition
This code fails ("extensions::uncaught_exception_handler:8 Error in event handler for storage.onChanged: TypeError: Cannot read property 'replace' of undefined") and the console shows that it runs BEFORE the sync storage "get" command.
I have tried to do this basic task a zillion ways and I continually run into the unpredictability of pulling from sync storage. I don't know how to get around this - I need to pull a value and modify it, then stuff it back into sync storage.

Okay, for the bozos like me who are blundering along trying to build Google Extensions with only cursory knowledge of javascript...
The "Get" process is asynchronous. So you need to use callbacks to control when the value you want will be available.
This is a good explanation - I used the example code verbatim:
Chrome.storage.sync.get not storing value in local variable
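For reference, a minimal sketch of that pattern applied to the variables from the question (Cbody, countWords, and wordCount come from the question; the write-back at the end is an assumption about what "stuff it back into sync storage" should look like):
chrome.storage.sync.get('Cbody', function (results) {
  var localBody = results.Cbody || '';            // only defined inside this callback
  wordCount = wordCount + countWords(localBody);  // count words once the value has arrived
  // Write the (possibly modified) string back to sync storage.
  chrome.storage.sync.set({ Cbody: localBody }, function () {
    console.log('Cbody written back to sync storage');
  });
});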

Related

matrix-js-sdk setup and configuration

I am having some issues trying to connect to a matrix server using the matrix-js-sdk in a react app.
I have provided a simple code example below and made sure that the credentials are valid (login works) and that the environment variable containing the URL for the matrix client is set. I have signed into Element in a browser and created two rooms for testing purposes, and was expecting these two rooms to be returned from matrixClient.getRooms(). However, it simply returns an empty array. With some further testing, it seems that only the asynchronous functions for fetching room, member and group IDs work as expected.
According to https://matrix.org/docs/guides/usage-of-the-matrix-js-sd these should be valid steps for setting up the matrix-js-sdk; however, the sync is never executed either.
const matrixClient = sdk.createClient(
  process.env.REACT_APP_MATRIX_CLIENT_URL!
);
await matrixClient.login("m.login.password", credentials);

matrixClient.once('sync', () => {
  debugger; // Never hit
});

for (const room of matrixClient.getRooms()) {
  debugger; // Never hit
}
I did manage to use the roomIds returned from await matrixClient.roomInitialSync(roomId, limit, callback), but this led me to another issue where I can't figure out how to decrypt messages, as the events containing the messages sent in the room seem to be of type 'm.room.encrypted' instead of 'm.room.message'.
Does anyone have any good examples of working implementations for the matrix-js-sdk, or any other good resources for properly understanding how to put this all together? I need to be able to load rooms, persons, messages etc. and display these respectively in a ReactJS application.
It turns out I simply forgot to run startClient on the matrix client, resulting in it not fetching any data.
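A minimal sketch of the corrected flow, assuming the same client setup as in the question (the 'PREPARED' state check is the usual pattern from the matrix-js-sdk docs, not something in the original post):
const matrixClient = sdk.createClient(process.env.REACT_APP_MATRIX_CLIENT_URL);
await matrixClient.login("m.login.password", credentials);

// Rooms are only populated after the first sync completes.
matrixClient.once("sync", (state) => {
  if (state === "PREPARED") {
    for (const room of matrixClient.getRooms()) {
      console.log(room.roomId);
    }
  }
});

// Without this call the client never starts syncing, so getRooms() stays empty.
await matrixClient.startClient();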

Firebase concurrency issue: how to prevent 2 users from getting the same Game Key?

DATABASE: (the database structure was shown as an image in the original post and is not reproduced here)
SITUATION:
My website sells keys for a game.
A key is a randomly generated string of 20 characters whose uniqueness is guaranteed (not created by me).
When someone buys a key, NTWKeysLeft is read to find its first element. That element is then copied, deleted from NTWKeysLeft and pasted to NTWUsedKeys.
Said key is then displayed on the buyer's screen.
PROBLEM:
How can I prevent the following problem :
1) 2 users buy the game at the exact same time.
2) They both get the same key read from NTWKeysLeft (first element in list)
3) And thus both get the same key
I know about Firebase Transactions already. I am looking for a pseudo-code/code answer that will point me in the right direction.
CURRENT CODE:
Would something like this work? Can I put a transaction inside another transaction?
var keyRef = admin.database().ref("NTWKeysLeft");
keyRef.limitToFirst(1).transaction(function (keySnapshot) {
  keySnapshot.forEach(function (childKeySnapshot) {
    // Key is read here:
    var key = childKeySnapshot.val();
    // How can I prevent two concurrent read requests from reading the same key?
    // Using a transaction to change a boolean could only happen after the read,
    // since I first need to read in order to know which key boolean to change.
    var selectedKeyRef = admin.database().ref("NTWKeysLeft/" + key);
    var usedKeyRef = admin.database().ref("NTWUsedKeys/" + key);
    var keysLeftRef = admin.database().ref("keysLeft");
    selectedKeyRef.remove();
    usedKeyRef.set(true);
    keysLeftRef.transaction(function (keysLeft) {
      if (!keysLeft) {
        keysLeft = 0;
      }
      keysLeft = keysLeft - 1;
      return keysLeft;
    });
    res.render("bought", { key: key });
  });
});
Just to be clear: keyRef.limitToFirst(1).transaction(function (keySnapshot) { does not work, but I would like to accomplish something to that effect.
Most depends on how you generate the keys, since that determines how likely collisions are. I recommend reading about Firebase's push IDs to get an idea how unique those are and compare that to your keys. If you can't statistically guarantee uniqueness of your keys or if statistical uniqueness isn't good enough, you'll have to use transactions to prevent conflicting updates.
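A hedged sketch of that advice (not tested against the poster's database; res is the Express response object from the question's own code): run one transaction over the whole NTWKeysLeft node, remove a key inside the update function so the claim is atomic, and only do the follow-up writes once the transaction has committed.
var keysLeftRef = admin.database().ref("NTWKeysLeft");
var claimedKey = null;

keysLeftRef.transaction(function (keys) {
  if (!keys) {
    // No cached data yet (or no keys left); returning the value unchanged
    // lets Firebase retry with the server state if they differ.
    return keys;
  }
  claimedKey = Object.keys(keys)[0]; // take one remaining key
  delete keys[claimedKey];           // removing it inside the transaction makes the claim atomic
  return keys;
}, function (error, committed) {
  if (error || !committed || !claimedKey) {
    return; // nothing was claimed
  }
  admin.database().ref("NTWUsedKeys/" + claimedKey).set(true);
  res.render("bought", { key: claimedKey });
});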
The OP has changed the question a bit, so I will update the answer as follows: I will leave the bottom part about transactions as it was and put the new update on top.
I can see two ways to proceed:
1) handle the lock system on your own and use JavaScript callbacks or other mechanisms for preventing simultaneous access to a portion of the code.
or
2) Use Firebase transactions. In this case, I don't have the setup ready to share code other than the sample/pseudo code provided at the bottom of this page.
With respect to option 1 above:
I have coded a use case and put it on Plunker. It uses JavaScript callbacks to queue users as they try to access the part of the code under lock.
I. A user comes in and is placed in the queue.
II. The callback function then pops users on a first-come, first-out basis. The keys are kept at the top of the page so they are shared by the functions.
I have wired this to a button click event, and when you click the button twice quickly, you will see keys assigned, and they are different keys.
To read this code, click on the script.js file on the left and read starting from the bottom of the page, where the functions are called.
Here is the sample code on Plunker. After opening it, click Run at the top of the page and then click the button on the right-hand side. An alert will pop up to show which key is given (note: there are two calls back to back to simulate two users coming in at the same time).
https://plnkr.co/edit/GVFfvqQrlLeMaKlo5FCj?p=info
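The Plunker itself is not reproduced here, but the idea is roughly the following sketch (names like withLock and assignKey are illustrative, not taken from the Plunker code):
// Keys shared by the functions, as described above.
var keys = ["key-aaa", "key-bbb", "key-ccc"];

var queue = [];
var busy = false;

// Serialise access: queued tasks run one at a time, first come first served.
function withLock(task) {
  queue.push(task);
  if (!busy) {
    next();
  }
}

function next() {
  var task = queue.shift();
  if (!task) {
    busy = false;
    return;
  }
  busy = true;
  task(function done() {
    next(); // run the next queued buyer once this one has finished
  });
}

function assignKey(buyer, done) {
  var key = keys.shift(); // each buyer gets a different key
  console.log(buyer + " gets " + key);
  done();
}

// Two "simultaneous" buyers are still served one after the other.
withLock(function (done) { assignKey("buyer 1", done); });
withLock(function (done) { assignKey("buyer 2", done); });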
The Firebase transactions:
Use Firebase transactions to prevent concurrent read/write issues - below is the transaction() method signature:
transaction(transactionUpdate, onComplete, applyLocally) returns a firebase.Promise containing { committed: boolean, snapshot: nullable firebase.database.DataSnapshot }
Note: transaction() takes an update (write) operation as its first parameter, and in your case it looks like you're removing a key upon success, hence the following function to be used as that update function.
Try this pseudo code:
// First, get a reference to your db
var selectedKeyRef = admin.database().ref("NTWKeysLeft/" + key);

// Needed by transaction() as its first parameter; returning null from the
// update function is what removes the value at this location.
function writeOperation(currentValue) {
  return null;
}

selectedKeyRef.transaction(writeOperation, function (error, committed, snapshot) {
  if (error) {
    console.log('Transaction failed abnormally!', error);
  } else if (!committed) {
    console.log('We aborted the transaction (because xyz).');
  } else {
    console.log('Key removed!');
  }
  console.log("showKey: ", snapshot.val());
}); // end of the transaction() method call
Docs - to see the parameters/return objects of the transaction() method, see:
https://firebase.google.com/docs/reference/js/firebase.database.Reference#transaction
From the docs: "If another client writes to the location before your new value is successfully written, your update function is called again with the new current value, and the write is retried."
https://firebase.google.com/docs/database/web/read-and-write#save_data_as_transactions
I don't think the problem you're worried about can happen. JavaScript, including Node, is single-threaded and can only do one thing at a time. If you had a big server infrastructure with more than one server running this code, then it would be possible, but for a single Node program, there's no problem.
Since none of the previous answers discussing the scope of Transactions worked out, I would suggest a different workaround.
Is it possible to trigger the unique code generation when someone buys a code? If so, you could generate the unique string when the "buy" button is clicked, display the ID, and save the ID to your database.
Later the user enters the key in your game, which checks whether the ID is written in your database. This would probably also save a bit of data, since you do not need to keep track of the unique IDs before they get bought, and you will never run out of IDs, since they are only generated when necessary.
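A hedged sketch of this suggestion in the same Firebase Admin style as the question, using a Firebase push ID as the generated string (an assumption; the question says the real keys come from elsewhere):
var usedKeysRef = admin.database().ref("NTWUsedKeys");

function sellKey(res) {
  var key = usedKeysRef.push().key;       // statistically unique ID, generated locally
  return usedKeysRef.child(key).set(true) // record the key as sold
    .then(function () {
      res.render("bought", { key: key });
    });
}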

Saving and accessing API results in a variable in Chrome extension

I'm making a Chrome extension that pulls a large amount of data from an API and uses it to modify content on a page. Because the amount of data is so large, I'd like to be able to save it once in the browser and be able to access it instead of doing an API call each time a page loads.
It's my understanding this can be done by putting the API call in the background page and then calling the variable from the background page in the content script. I've also tried storing the data in local storage. Neither method is working for me and I don't understand what I'm doing wrong.
In my code in the background page, I have the API results stored in a variable. I'm calling it in the content script like this:
var background = chrome.extension.getBackgroundPage();
var myData = background.APIData; //where APIData is the variable I set in the background page
My attempt to use local storage looks like this:
//Background page
chrome.storage.sync.set({'value': APIData}, function() {
// Notify that we saved.
console.log('APIData saved to storage');
});
//Content script
var myData = localStorage["APIData"];
As of this moment, the extension isn't even loading in the page using the code where I'm trying to access local storage. The extension will load with the other method but the data doesn't seem to be there. I know my API call is working because the extension works when I put it all in the content script. But that creates the problem where I'm calling the API each time the page loads. Help please!
Examine your code closely. You are setting the data to two different bins.
First, you call chrome.storage.sync.set and set "value" to the value of APIData.
Second, you call localStorage['APIData'] which refers to a separate storage bin and property name.
Two solutions here, the first being the preferred way to set/get persistent data in a Chrome extension:
A1) set it with chrome.storage.sync.set( { value:data } )
A2) get it with chrome.storage.sync.get( null, function( storage ){ storage.value } )
B1) set it with localStorage.value = data
B2) get it with localStorage.value
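A minimal sketch of option A wired up to the question's APIData variable (the background/content-script file split is assumed from the question; chrome.storage.local works the same way and has a larger quota if the API payload is big):
// background.js - save the API result once it arrives
chrome.storage.sync.set({ APIData: APIData }, function () {
  console.log('APIData saved to storage');
});

// content script - read it back with the same API and the same key
chrome.storage.sync.get('APIData', function (storage) {
  var myData = storage.APIData;
  // use myData to modify the page here
});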

Cloud code not working

I am new at Parse and trying out code: I'm trying to run a trigger required in my project, but I am not able to track down the problem, and I am not even getting any error.
I am using Cloud Code, i.e. triggers.
What I want to do is: after an update or save, run a trigger that will update a column in a class with the value 200.
Parse.initialize('APPLICATION_ID', 'JAVASCRIPT_KEY');
Parse.Cloud.afterSave("match_status", function(request)
{
var query = new Parse.Query('Wallet');
query.set("wallet_coines_number", 200);
query.equalTo("objectId", "FrbLo6v5ux");
query.save();
});
I am using an afterSave trigger in which match_status is my trigger name. After that, I make an object called query of the Wallet class. This object will set the column 'wallet_coines_number' to the value 200 where the objectId is FrbLo6v5ux. After that, I use the save function, which will execute the query.
Please guide me if I am wrong or following the wrong approach.
Thank you!
Have you read the Parse documentation on Cloud Code?
The first line of your code is only relevant when you are initialising the Parse JavaScript SDK in a web page; you do not need to initialise anything in Parse Cloud Code in the main.js file. Also, you cannot use a query to save/update an object. A query is for searching/finding objects; when you want to save or update an object, you need to create a Parse.Object and save that.
So your code should become something like:
Parse.Cloud.afterSave("match_status", function(request) {
  var wallet = new Parse.Object('Wallet');
  wallet.set("wallet_coines_number", 200);
  wallet.set("objectId", "FrbLo6v5ux");
  wallet.save();
});
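As a hedged alternative sketch (not taken from the answer above): if the Wallet row with that objectId already exists, you can also fetch it first and then save, so the update is applied to the existing object:
Parse.Cloud.afterSave("match_status", function (request) {
  var query = new Parse.Query("Wallet");
  return query.get("FrbLo6v5ux")            // fetch the existing Wallet object
    .then(function (wallet) {
      wallet.set("wallet_coines_number", 200);
      return wallet.save();                 // persist the updated column
    })
    .catch(function (error) {
      console.error("Wallet update failed: " + error.message);
    });
});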

Using AngularJS to process custom localStorage data

I wrote a bookmarklet that retrieves information from a page and stores it in JSON format in local storage (converting it to a string first, of course).
I would like a web app I am writing to be able to process this data, on the fly, preferably as it gets saved to the localStorage.
Right now I can change the item in LS via the console and refresh the page and the new data appears, but I would like it to be live and seamless.
Any advice on how to go about this? I found several localStorage modules for AngularJS and tried them, but they don't seem to allow me to retrieve data from LS if it is already there.
In response to answer:
$scope.$watch(
  function () {
    return $window.localStorage.getItem('TestData');
  },
  function (newValueInStorage) {
    $scope.testingLS = newValueInStorage;
  }
);
I tried this, and I still get the data displayed by just doing a {{ testingLS }} in the view template, but when I go and change the TestData key in local storage via the console, it doesn't update instantly. (For now, I am just testing it without the bookmarklet, with just a simple string inside TestData.)
There are a few ways to do it.
One would be to populate the correct model on the scope when saving to localStorage.
The other that I can think of at this moment is to set up a watcher:
$scope.$watch(
  function () {
    // return the raw localStorage value
    return $window.localStorage.getItem('TestData');
  },
  function (newValueInStorage) {
    $scope.modelFromLS = JSON.parse(newValueInStorage);
  }
);
---edit---
As per James' comment, you need something that will handle the fact that the data has changed in a different tab, and the $digest process needs to run for the watch to be recalculated:
http://plnkr.co/edit/zlS3wL65meBeA8KkV5KH?p=preview
window.addEventListener('focus', function () {
  console.log('focus');
  $scope.$digest();
});
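A hedged addition to the same idea: browsers also fire a 'storage' event in other tabs on the same origin when localStorage changes, so the digest can run without waiting for focus (note this event does not fire in the tab that made the change):
window.addEventListener('storage', function (event) {
  if (event.key === 'TestData') {
    $scope.$apply(); // run a digest so the $watch above picks up the new value
  }
});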
