I'm using angularfire and the realtime database
I have a database as so :
actions: {
  uid1: {
    meta...
  },
  uid2: {
    meta...
  },
  uid3: {
    meta...
  },
  uid4: {
    meta...
  },
}
I've made an editor showing all of those actions. Now let's say I add an action (uid5): my valueChanges method sends me back all the uids, when I only need the updated value.
I would do something a bit like this
ngOnInit() {
  this.db.object(`actions`).valueChanges().subscribe(
    (actions) => console.log(actions) // the first time I get uid1{}, uid2{}, ..., uid4{}
    // the second time, after adding uid5, I would get uid5{} only
  );
}
So is it possible? Is there some specific event for this, or should I make a feature request?
There is nothing built into Firebase or AngularFire to get only the new items, so you'll have to build something yourself.
The two most common options:
Store a timestamp value in each node, and then query for only items after "now" with something like ref.orderByChild("timestamp").startAt(Date.now()).
Start at keys after "now", with something like ref.orderByKey().startAt(ref.push().key). This works because push keys start with a timestamp component, so they sort chronologically.
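If you'd rather filter on the client instead, a minimal sketch in plain JavaScript (this helper is not part of AngularFire): remember the keys you've already seen, and surface only the additions each time valueChanges fires.

```javascript
// Returns a function that, given the full actions object each time,
// yields only the entries whose keys haven't been seen before.
function makeNewItemFilter() {
  const seen = new Set();
  return function onlyNew(actions) {
    const added = {};
    for (const key of Object.keys(actions || {})) {
      if (!seen.has(key)) {
        seen.add(key);
        added[key] = actions[key];
      }
    }
    return added;
  };
}
```

You would call the returned function inside the subscribe callback: the first emission returns everything, later ones only the newly added uids.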
I am using TypeScript and the Firebase Realtime Database (cannot use Firestore), and I have data in the form of an interface MyData as follows:
enum RunStatus {
  RUNNING,
  ENDED
}

interface IResults {
  firstItem: string;
  secondItem: number;
}

interface MyData {
  status: RunStatus;
  results: IResults;
}
Suppose I have 10 clients that might end up writing this data simultaneously to (1) change status from RUNNING to ENDED and (2) set the results field.
What I want is for only the first client to be able to do this, so I need to use a transaction of some sort.
My data is stored at this path: "/some/path/here/mydata".
As near as I can tell from the limited documentation, my writes should look something like this:
class MyDatabase {
  db: Database;

  constructor(db_: Database) {
    this.db = db_;
  }

  writeMyData(newData: MyData) {
    const path = "/some/path/here/mydata";
    const reference = child(ref(this.db), path);
    runTransaction(reference, (currentData) => {
      if (currentData) {
        if (currentData.status !== RunStatus.ENDED && newData.status === RunStatus.ENDED) {
          currentData.status = newData.status;
          currentData.results = newData.results;
          // Or, I could have just set currentData = newData
        }
      }
      return currentData;
    });
  }
}
Is this correct? And what exactly does it do if multiple clients try to run this at the same time? The documentation says something about runTransaction being automatically rerun if currentData is updated by another client while the first client is writing. Can someone please explain to me if this is correct, and what is happening here?
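For intuition, the retry behavior described in the documentation can be modeled as an optimistic compare-and-set loop. The sketch below is a simplified toy model, not the Firebase SDK: the update function runs against a local view of the data; if another client committed in the meantime, the loop re-runs the update function with fresh data until the write lands (or the function aborts by returning undefined).

```javascript
// Toy in-memory store with a versioned compare-and-set, standing in for the server.
function makeStore(initial) {
  let value = initial;
  let version = 0;
  return {
    read: () => ({ value, version }),
    compareAndSet(expectedVersion, next) {
      if (version !== expectedVersion) return false; // another client wrote first
      value = next;
      version += 1;
      return true;
    },
  };
}

// Simplified model of runTransaction's retry loop.
function runTransactionModel(store, update) {
  for (;;) {
    const { value, version } = store.read();
    const next = update(value);
    if (next === undefined) return { committed: false, value }; // transaction aborted
    if (store.compareAndSet(version, next)) return { committed: true, value: next };
    // Someone else committed in between: loop and retry with fresh data.
  }
}
```

With an update function like yours, only the first client sees status !== ENDED and writes its results; clients that run after it (or are re-run after a conflict) see ENDED, skip the assignment, and return currentData unchanged.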
I have a doc in couchDB:
{
  "id": "avc",
  "type": "Property",
  "username": "user1",
  "password": "password1",
  "server": "localhost"
}
I want to write a view that returns a map of all these fields.
The map should look like this: [{"username","user1"},{"password","password1"},{"server","localhost"}]
Here's pseudocode of what I want -
HashMap<String,String> getProperties()
{
  HashMap<String,String> propMap;
  if (doc.type == 'Property')
  {
    //read all fields in doc one by one
    //get value and add field/value to the map
  }
  return propMap;
}
I am not sure how to do the portion that I have commented above. Please help.
Note: right now, I want to add username, password and server fields and their values in the map. However, I might keep adding more later on. I want to make sure what I do is extensible.
I considered writing a separate view function for each field, e.g. emit("username", doc.username).
But this may not be the best way to do it, and it needs updating every time I add a new field.
First of all, you have to know:
In CouchDB, you index documents inside a view as key-value pairs. So if you index the properties username and server, you'll have the following view:
[
{"key": "user1", "value": null},
{"key": "localhost", "value": null}
]
Whenever you edit a view, it invalidates the index so Couch has to rebuild the index. If you were to add new fields to that view, that's something you have to take into account.
If you want to query multiple fields in the same query, all those fields must be in the same view. If it's not a requirement, then you could easily build an index for every field you want.
If you want to index multiple fields in the same view, you could do something like this:
// We define a map function as a function which takes a single parameter: the document to index.
(doc) => {
  // We iterate over a list of fields to index
  ["username", "password", "server"].forEach((key) => {
    // If the document has the field to index, we index it.
    if (doc.hasOwnProperty(key)) {
      // emit(key, value) is the function you call to index your document.
      // You don't need to pass a value, as you'll be able to get the matching
      // document by using include_docs=true
      emit(doc[key], null);
    }
  });
};
Also, note that Apache Lucene allows full-text search and might better fit your needs.
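You can sanity-check a field-driven map function like the one above outside CouchDB by stubbing emit. The harness below is for local testing only; inside CouchDB, emit is provided globally by the view server, so the extra parameter is an adaptation for the test.

```javascript
// Run a CouchDB-style map function against one document, collecting its emits.
function runMap(mapFn, doc) {
  const rows = [];
  const emit = (key, value) => rows.push({ key: key, value: value });
  mapFn(doc, emit);
  return rows;
}

// The same field-driven map function, with emit passed in explicitly for the harness.
const mapFields = (doc, emit) => {
  ["username", "password", "server"].forEach((field) => {
    if (Object.prototype.hasOwnProperty.call(doc, field)) {
      emit(doc[field], null);
    }
  });
};
```

Running it against the sample document yields one row per present field, keyed by the field's value, matching what the view would index.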
Take a look at the following JSON structure:
"Fund_Project_Request" : {
  "-LEEy7uxXEeI4AJuePoB" : {
    "4ZpTt0rHvjYfKAnCukIlhGpH6kz2" : {
      "afds1234" : 2,
      "asdf12" : 2
    },
    "iRfNzDSjFiOADqn3KsG8nNuZEfp2" : {
      "afds1234" : 1
    }
  }
},
Here, if I want to get the values 'afds1234' or 'asdf12', which I'm going to call 'reward_ids', in an onWrite function, all I have to do is:
exports.manipulateRewards = functions.database.ref('/Fund_Project_Request/{ArtcallID}/{UserID}/{rewardID}').onWrite((change, context) => {
const reward_id = context.params.rewardID;
});
Let's say I want to obtain these reward_ids strings without using the onWrite function. Would I be able to do so with a singleValueEventListener or any other method of querying?
When writing code to query the Realtime Database, there are no wildcards. You must know the exact path of the data you're interested in.
Cloud Functions triggers aren't really anything like normal listeners. They are essentially filtering all writes that flow through the system, and triggering only on the writes that match the given path.
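So to enumerate reward ids outside a trigger, you fetch a parent path you do know (e.g. read /Fund_Project_Request once) and walk the resulting value yourself. A plain-object sketch of that walk (the Firebase read itself is omitted; this operates on the snapshot's value):

```javascript
// Given the value of /Fund_Project_Request, collect the third-level keys
// (the "reward ids") across all artcall and user nodes.
function collectRewardIds(fundRequests) {
  const ids = [];
  for (const artcallId of Object.keys(fundRequests || {})) {
    const users = fundRequests[artcallId] || {};
    for (const userId of Object.keys(users)) {
      ids.push(...Object.keys(users[userId]));
    }
  }
  return ids;
}
```

Note this reads the entire subtree, which is fine for small nodes but worth restructuring (e.g. a flat index of reward ids) if the tree grows large.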
I have a form that consists of a number of multi-select fields. Each select has a unique ID, and are named accordingly:
values[foobar1][]
values[foobar2][]
values[foobar3][]
... etc.
This form could potentially contain hundreds of these fields, and so it is paged by ajax. The result is that there is no guarantee that all records are going to be available at once on the front end, so it is impossible for me to submit the entire form. I do, however, have access to the entire list of records server-side.
My solution to this was to watch for changes in the form fields and, for every field that is changed, store the values in an array to keep track of just the altered field values. So if you make a change to just foobar2, the resulting serialized array that is sent to the server will look like this:
0: Object {
  name: "values[foobar2][]",
  value: "thevalue1"
},
1: Object {
  name: "values[foobar2][]",
  value: "thevalue3"
}
So this works fine except for, as you may have guessed, when the multi-select is emptied. No matter what format I use for storing the altered values, be it array serialization of each field or an associative array, when I pass my array to $.param() for the ajax request, the resulting serialized string contains no trace of the empty value. So there is no way for the server to determine that the value has been emptied.
Can anyone suggest a way of either passing the data to the server so that the emptied array remains intact, or another way of dealing with the initial problem?
Thanks in advance!
You want to calculate the diff between current and previous state, send the change to the server, and apply it to the data.
You can do so using the JSON Patch standard (RFC 6902).
JSON Patch is a format for describing changes to a JSON document. It
can be used to avoid sending a whole document when only a part has
changed. When used in combination with the HTTP PATCH method it allows
partial updates for HTTP APIs in a standards compliant way.
To create the diff you can use an NPM module, such as jiff. A diff is a set of patching commands that can transform a JSON document. For example:
[
{ "op": "replace", "path": "/values/foobar2", "value": ["thevalue1"] },
{ "op": "remove", "path": "/values/foobar1" }
]
You send the diff to the server, and then use a server module, such as php-jsonpatch, to apply the patch to the current data on the server.
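For this particular form, even a hand-rolled shallow diff over the values map shows the idea (jiff computes general deep diffs; the helper below is a simplified sketch). Note that an emptied field produces an explicit replace op with an empty array, so the server does see the change:

```javascript
// Shallow diff of two { field: [values] } maps into JSON Patch-style operations.
function diffValues(before, after) {
  const ops = [];
  for (const key of Object.keys(after)) {
    if (JSON.stringify(before[key]) !== JSON.stringify(after[key])) {
      ops.push({
        op: key in before ? "replace" : "add",
        path: "/values/" + key,
        value: after[key],
      });
    }
  }
  for (const key of Object.keys(before)) {
    if (!(key in after)) {
      ops.push({ op: "remove", path: "/values/" + key });
    }
  }
  return ops;
}
```

Emptying foobar2, for instance, yields a single replace op whose value is [], which the server can apply unambiguously.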
Create a single object for all select field values; you can use localStorage or sessionStorage to store it, since the form spans many pages and you use ajax to get each select field. Place the selected values of each field in an array. Creating an object like this is the idea:
{
  formValues: {
    foobar1: {
      values: ["thevalue1", "thevalue2"]
    },
    foobar2: {
      values: ["thevalue3"]
    },
    ...
    foobarN: {
      values: []
    }
  }
}
Every time you update a select field's value or values, make sure to update the value saved in localStorage, e.g.
var valuesObject = {
  formValues: {
    foobar1: {
      values: ["thevalue1", "thevalue2"]
    },
    foobar2: {
      values: ["thevalue3"]
    },
    foobar3: {
      values: []
    }
  }
};

// Put the object into storage
localStorage.setItem('valuesObject', JSON.stringify(valuesObject));

// Retrieve the object from storage
var valuesObjectA = localStorage.getItem('valuesObject');
//console.log('valuesObject: ', JSON.parse(valuesObjectA));

// post your data
$.post("ajax.php", valuesObjectA).done(function (data) {
  alert("Data Loaded: " + data);
}).fail(function () {
  console.log("error");
});
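One wrinkle worth handling explicitly: write through a small helper so that an emptied multi-select still stores an empty array rather than disappearing. A sketch (the storage object is passed in, so it works identically with localStorage, sessionStorage, or a test stub):

```javascript
// Merge one field's current selection into the stored object and save it back.
// `storage` is anything with getItem/setItem (localStorage, sessionStorage, a stub).
function updateStoredField(storage, field, values) {
  const raw = storage.getItem('valuesObject');
  const obj = raw ? JSON.parse(raw) : { formValues: {} };
  obj.formValues[field] = { values: values }; // an empty selection stays as []
  storage.setItem('valuesObject', JSON.stringify(obj));
  return obj;
}
```

Because the empty array is written explicitly, the server-side handler can distinguish "emptied" from "never touched".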
I have a simple Todo app that doesn't persist updates to multiple records at once. The client version works properly here: http://codepen.io/anon/pen/bVVpyN
Every time I perform an action, I fire an HTTP request to my Laravel API to persist the data. But I have trouble with my completeAll method.
Whenever I click the Complete All button, I execute the completeAll method:
completeAll: function () {
  var tasks = [];
  var taskIds = [];
  this.tasks = this.tasks.filter(function (task) {
    task.completed = true;
    tasks.push(task);
    taskIds.push(task.id);
    return task.completed;
  });
  this.$http.put('api/tasks/' + taskIds, { tasks: tasks });
},
Here I send a request to a URL that might look like this:
http://localhost:8000/api/tasks/1,3,8,4
And together with it, I send the tasks object. This is the request payload from chrome developer tools:
{tasks: [{id: 1, body: "One", completed: true, created_at: "2015-09-09 08:36:38",…},…]}
Then on the serverside I have an update function in my TasksController that looks like this:
public function update(Request $request, $id)
{
    // gives ["1", "2", "3"] instead of "1,2,3"
    $taskIds = explode(",", $id);

    // what happens if I update more than one task at once
    // ** PROBLEMATIC SECTION **
    if (count($taskIds) > 1) {
        $tasks = Task::whereIn('id', $taskIds)->get()->toArray();
        $newTasks = Input::get('tasks');
        var_dump($tasks);
        var_dump($newTasks);
        /** This line does not work **/
        Task::whereIn('id', $taskIds)->update(Input::get('tasks'));
        return 'update more than one task at once';
    }

    // what happens if I update only one task
    Task::where('id', $id)->update(Input::all());
    return 'update exactly one task';
}
The $tasks and $newTasks variables above var_dump the same output, because I use the toArray() method on $tasks.
But when I try to update the tasks I get this error:
preg_replace(): Parameter mismatch, pattern is a string while replacement is an array
Not sure why, since in my understanding Task::whereIn('id', $taskIds) is an object and not a string.
Alternative routes that I tried so far
Using a foreach/for loop to loop through the $tasks array, which I couldn't make work, because I also had to loop through the $newTasks array to assign them, like this:
foreach ($tasks as $task) {
    for ($i = 0; $i < count($taskIds); $i++) {
        Task::where('id', $task->id)->update($newTasks[$i]);
    }
}
Creating another endpoint on my API to just complete every task, to an URL like this:
http://localhost:8000/api/tasks/complete-all
And then executing a completeAll method on my TasksController, where I set each task to completed. But I couldn't make it work, because I got a
Method not allowed exception
Not sure why, but I figured it was better to use my original method with an API like this:
api/tasks/1,2,50,1,3
Because it was suggested here: https://laracasts.com/discuss/channels/code-review/vuejs-delete-multiple-tasks-each-task-one-request
Please advise on my original attempt to help me make this simple Todo app work.
Alternatively, if it is for the better, please help me make the foreach/for loop work. Thanks.
EDIT
According to an answer, it was suggested to follow point 2 from above and create an additional endpoint.
These are my routes:
I use a POST method to the URL:
http://localhost:8000/api/tasks/complete-all
Executed from my app.js through this:
this.$http.post('api/tasks/complete-all');
And this is my adapted TasksController:
public function completeAll(Request $request)
{
    $tasks = Task::where('completed', false)->get();

    foreach ($tasks as $task) {
        Task::where('id', $task->id)->update(['completed' => true]);
    }

    return response()->json(['message' => 'All tasks completed']);
}
The app persists the data correctly now, but it troubles me that I am not really updating anything from the client side. I am just executing server-side code, because I am not passing any query.
How would I solve the problem using a URL like this:
http://localhost:8000/api/tasks/1,23,43,5
Thanks
Laravel CRUD methods operate on a single resource. As such, sending several resources to update at once does not follow this convention.
While you could modify the update method, I would encourage you to create a completeTasks or completeAll method that operates on several resources. You could pass the multiple resource ids in the request body.
To do this, you need to:
Add a Route
Parse the request data for the Todo ids (like your for loop)
Call update on the database (Task::whereIn('id', $taskIds)->update(['completed' => true]))
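On the client, those steps pair with sending the ids in the request body rather than the URL. A sketch (the endpoint name completeTasks and the payload shape are assumptions, not an existing API):

```javascript
// Build the request body for a bulk-complete endpoint from the task list:
// only tasks that aren't already completed need to be sent.
function buildCompletePayload(tasks) {
  return {
    ids: tasks.filter((t) => !t.completed).map((t) => t.id),
    completed: true,
  };
}

// Usage in the Vue method (hypothetical endpoint):
// this.$http.put('api/tasks/completeTasks', buildCompletePayload(this.tasks));
```

Server-side, the controller would read the ids array from the body and run the whereIn/update from step 3 once, instead of one query per task.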