Breeze Js - Local query of "Lookup Lists" - javascript

For performance optimization in our application we use bulk entity loading, as in the "Lookup Lists" example: http://www.breezejs.com/documentation/lookup-lists.
The query looks like this:
entityQuery.from('SomeBreezeAction')
    .using(manager)
    .execute()
    .then(function (res) {
        var set1 = res.results[0].first;
        var set2 = res.results[0].second;
    });
It works very well remotely: Breeze correctly understands the entity types of both result sets and generates objects based on the metadata. In the same application we use local Breeze queries for Jasmine tests of the client-side logic. But a query like:
entityQuery.from('SomeBreezeAction')
    .using(manager)
    .executeLocally()
    .then(function (res) {
        var set1 = res.results[0].first;
        var set2 = res.results[0].second;
    });
fails with error: Error: Cannot find an entityType for resourceName: 'SomeBreezeAction'. Consider adding an 'EntityQuery.toType' call to your query or calling the MetadataStore.setEntityTypeForResourceName method to register an entityType for this resourceName.
This is reasonable, because we are not using toType or setEntityTypeForResourceName in this query. So my question: is it possible to use toType, setEntityTypeForResourceName, or something else for such queries that return more than one entity type in one request?
If it's important: we define the metadata manually and have no direct EF or other database connection.

The problem is that the client has no idea what "SomeBreezeAction" means. The logic for this operation is entirely contained on the server.
What you can do, though, is create your own function that performs locally the same queries that "SomeBreezeAction" performs on the server and combines the results into the same 'shape' that your server-side query returns. This function can then be called whenever you want a local version of the same query.
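For example, a minimal sketch of such a helper, assuming two placeholder entity types Foo and Bar exposed through the resources 'Foos' and 'Bars' (substitute your own types and the shape your server action actually returns):

// Sketch: replicate 'SomeBreezeAction' against the local cache.
// 'Foos'/'Bars' and the { first, second } shape are placeholders.
function someBreezeActionLocally(manager) {
    var first = breeze.EntityQuery.from('Foos')
        .toType('Foo')
        .using(manager)
        .executeLocally();

    var second = breeze.EntityQuery.from('Bars')
        .toType('Bar')
        .using(manager)
        .executeLocally();

    // Return the same 'shape' the server endpoint produces.
    return { first: first, second: second };
}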

Related

Multiple endpoints in same query

It works perfectly with a single endpoint.
With apollo-link-rest, I have made a client that looks like this
const restLink = new RestLink({ uri: "https://example.com/" })
and I export the client with new ApolloClient({...}).
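For reference, a typical setup might look like the sketch below; the cache choice and import paths are assumptions, not taken from the question:

import { ApolloClient } from "apollo-client";
import { InMemoryCache } from "apollo-cache-inmemory";
import { RestLink } from "apollo-link-rest";

// RestLink pointing at the server from the question.
const restLink = new RestLink({ uri: "https://example.com/" });

// Export the client so queries can be run elsewhere in the app.
export const client = new ApolloClient({
  link: restLink,
  cache: new InMemoryCache(),
});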
Now to the question
On the same server, https://example.com/, there are multiple endpoints, all with the same fields but different data in each.
The first query, which works, looks like this:
export const GET_PRODUCTS = gql`
  query firstQuery {
    # the path could be "second/feed" and it will work with different data
    products @rest(type: "data", path: "first/feed") {
      id
      title
    }
  }
`
I want all these different paths combined into one and the same JSON feed, because they all have the same fields, but with different data.
Using aliases
You can (it should be possible) use the standard aliasing method to make similar queries, i.e. get many results that would normally be available under the same shape (node name). This is described here.
{
  feed1: products(....
  feed2: products(....
  ...
}
Paths can be created using variables
Results can be easily combined by iterating over the data object. The problem? This only works for a fixed (small) number of endpoints, since the query shouldn't be generated by string manipulation.
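A minimal sketch of the alias approach, reusing the type and paths from the question (the alias names firstFeed and secondFeed are arbitrary):

import gql from "graphql-tag";

// Same field requested twice under different aliases, each with its own path.
export const GET_ALL_PRODUCTS = gql`
  query allFeeds {
    firstFeed: products @rest(type: "data", path: "first/feed") {
      id
      title
    }
    secondFeed: products @rest(type: "data", path: "second/feed") {
      id
      title
    }
  }
`;

// Merge the aliased results into one array on the client.
const mergeFeeds = (data) => [...data.firstFeed, ...data.secondFeed];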
Multiple graphql queries
You can create the queries in a loop - parametrized - using Promise.all() and apollo-client's client.query(). The results need to be combined into one, too.
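A sketch of that approach; GET_FEED is a hypothetical parametrized query (apollo-link-rest can interpolate field arguments into the path), and client is the exported ApolloClient:

import gql from "graphql-tag";

// Hypothetical parametrized query: the field argument is substituted
// into the path via the {args.*} placeholder.
const GET_FEED = gql`
  query feed($path: String!) {
    products(path: $path) @rest(type: "data", path: "{args.path}") {
      id
      title
    }
  }
`;

// Run one query per path in parallel and merge the results afterwards.
async function fetchAllFeeds(client) {
  const paths = ["first/feed", "second/feed"];
  const results = await Promise.all(
    paths.map((path) => client.query({ query: GET_FEED, variables: { path } }))
  );
  return results.flatMap(({ data }) => data.products);
}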
Custom fetch
Using a custom fetch you can create a query that takes an array of paths. In this case the resolver should use Promise.all() on the parametrized fetch requests. The combined results can be returned as a single node (as required).
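A rough sketch of that idea using RestLink's customFetch option; the combined/feed "virtual" path and the merge logic are assumptions:

import { RestLink } from "apollo-link-rest";

// When the query asks for the hypothetical combined/feed path,
// fan out to the real endpoints and merge their JSON into one response.
const combinedFetch = async (uri, options) => {
  if (String(uri).endsWith("combined/feed")) {
    const paths = ["first/feed", "second/feed"];
    const responses = await Promise.all(
      paths.map((p) => fetch(`https://example.com/${p}`, options))
    );
    const feeds = await Promise.all(responses.map((r) => r.json()));
    // Wrap the merged array in a Response so Apollo treats it like a normal fetch result.
    return new Response(JSON.stringify(feeds.flat()), {
      headers: { "Content-Type": "application/json" },
    });
  }
  return fetch(uri, options);
};

const restLink = new RestLink({
  uri: "https://example.com/",
  customFetch: combinedFetch,
});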
Drawbacks
All these methods require making multiple requests. The problem can be resolved by adding a server-side REST wrapper (docs or blog).
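For completeness, a tiny sketch of such a wrapper; Express and node-fetch are assumed, and the /combined/feed route name is made up:

const express = require("express");
const fetch = require("node-fetch");

const app = express();
const PATHS = ["first/feed", "second/feed"];

// One endpoint that fetches every feed and returns the merged array.
app.get("/combined/feed", async (req, res) => {
  const feeds = await Promise.all(
    PATHS.map((p) => fetch(`https://example.com/${p}`).then((r) => r.json()))
  );
  res.json(feeds.flat());
});

app.listen(3000);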

How to create a Shared Query Folder using the vso-node-api (VSTS)?

In the VSTS Rest API, there's a piece of documentation showing me how to create a folder. Specifically, I would like to create a folder within the Shared Queries folder. It seems like I can do this with the REST API.
I would like to do the same thing with the VSTS Node API (vso-node-api). The closest analogous function I can seem to find would be WorkItemTrackingApi.createQuery. Is this the correct function to use?
When I try to use this function, I'm getting an error:
Failed request: (405)
That seems strange, since a "Method Not Allowed" error doesn't seem like the right error here. In other words, I'm not the one deciding which method (GET/POST/etc.) to use; I'm just calling the VSTS Node API's function, which should be using the correct HTTP request method.
I think the error code would/should be different if something about my request is wrong (like providing bad parameters/data).
But, I would not be surprised if VSTS didn't like the data I provided with the request. I wrote the following test function:
async function createQueryFolder (QueryHeirarchyItem, projectId, query) {
  let result = await WorkItemTrackingApi.createQuery(QueryHeirarchyItem, projectId, query)
  return result
}
I set some variables and called the function:
let projectID = properties.project // A previously set project ID that works in other API calls

let QueryHeirarchyItem = {
  isFolder: true,
  name: 'Test Shared Query Folder 1'
}

try {
  let result = await createQueryFolder(QueryHeirarchyItem, projectID, '')
} catch (err) {
  console.log(err) // logs: Failed request: (405)
}
Notice that I provided a blank string for the query - I have no idea what to provide there when all I want to create is a folder.
So, I think a lot of things could be wrong with my approach here, but also, if my request parameters are wrong, shouldn't I be getting a 400 error? A 405 leads me to believe that the VSTS Node API is making a REST call that the underlying VSTS REST API doesn't understand.
For the third parameter of createQueryFolder you should specify the path of the folder under which you want to create the new folder.
For example, if you want to create a folder Test Shared Query Folder 1 under Shared Queries, call createQueryFolder like this:
let result = await createQueryFolder(QueryHeirarchyItem, projectID, 'Shared Queries')
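Putting it together, a rough sketch of the whole call, assuming an authenticated vso-node-api WebApi instance is already available as connection:

// Sketch: create a folder directly under 'Shared Queries' with vso-node-api.
async function createSharedQueryFolder(connection, projectID) {
  const witApi = await connection.getWorkItemTrackingApi();
  const folder = {
    isFolder: true,
    name: 'Test Shared Query Folder 1'
  };
  // The third argument is the parent path for the new item.
  return witApi.createQuery(folder, projectID, 'Shared Queries');
}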

Caching select query data on server side

I am writing an Express app where I'm pushing data from my views to a database, but most of the data is mapped to some other data in database tables.
For example, there is a 'choose student name' drop-down; once you choose a student by name, a drop-down below it shows all the roles he is allowed.
So I'm following this pattern:
app.post('/action1', function (req, res) {
  function querySomething() {
    var defered = Q.defer();
    connection.query(some_select_query, defered.makeNodeResolver());
    return defered.promise;
  }

  function querySomethingElse() {
    var defered = Q.defer();
    connection.query(some_other_select_query, defered.makeNodeResolver());
    return defered.promise;
  }

  Q.all([querySomething(), querySomethingElse()]).then((results) => {
    connection.release();
    if (results) {
      res.render('some_view.ejs', {
        result1: results[0][0],
        result2: results[1][0]
      });
    } else {
      res.render('error.ejs', {});
    }
  });
});
Now the problem is that I have to follow this pattern of selecting something from multiple tables, passing all these functions to a promise, and, when the results come back, going to my view with all those result objects so that I can use them in my view to build drop-downs dependent on one another.
Sometimes I have to re-write this multiple times.
Doing a select query like this would be performance intensive, especially if all views are using the result of the same query.
Is there any way I can build a cached data store in my Express server-side code and query that instead of the actual database?
If there is an insert or an update, I will refresh this store and just do a new select * that one time.
What libraries are there on top of Express that will help me do this?
Does mysql-cache do the same thing? I'm also using connection pooling with createPool.
How do I achieve this, or do I just resort to using a big MVC framework like Sails and rewrite my app?
You can try the apicache npm module.
"Sometimes I have to re-write this multiple times."
Based on the business need, you may want to handle each use case separately and this scenario doesn't deal with caching.
Doing a select query like this would be performance intensive especially if all views are using the result of the same query.
This is a classic example for the need of server-side caching.
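As a sketch of what that can look like with apicache (the route names and cache duration here are made up):

const express = require('express');
const apicache = require('apicache');

const app = express();
const cache = apicache.middleware;

// Cache GET responses for this route for 5 minutes.
app.get('/students', cache('5 minutes'), (req, res) => {
  // ... run the select queries and render the view as before
});

// On writes, clear the cached entry so the next read hits the database again.
app.post('/students', (req, res) => {
  // ... perform the insert/update here
  apicache.clear('/students');
  res.sendStatus(204);
});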

meteorjs subscribe usage when collection is huge

I don't know the best way to handle huge Mongo databases with Meteor.
In my example I have a database collection of addresses together with their geo locations (all the code snippets are just examples).
Example:
{
  address: 'Some Street',
  geoData: [lat, long]
}
Now I have a form where the user can enter an address to get the geo data. Very simple. But the problem is that the collection with the geo data has millions of documents in it.
In Meteor you have to publish a collection on the server side and subscribe on the client/server side. So my code is like this:
// Client / Server
Geodata = new Meteor.Collection('geodata');

// Server side
Meteor.publish('geodata', function () {
  return Geodata.find();
});

// Client / Server
Meteor.subscribe('geodata');
Now a person has filled in the form; after this I get the data and search for the right document to return. My method is this:
// Server / Client
Meteor.methods({
  getGeoData: function (address) {
    return Geodata.find({ address: address });
  }
});
The result is the right one, and this works. But my question is now:
What is the best way to handle an example like this with a huge database? The problem is that Meteor saves the whole collection in the user's cache when I subscribe to it. Is there a way to subscribe to just the results I need, and to overwrite the subscription when the user reuses the form? Or is there another good way to keep performance acceptable with huge databases used the way I use them in my example?
Any ideas?
Yes, you can do something like this:
// client
Deps.autorun(function () {
  // will re-subscribe every time the 'center' session variable changes
  Meteor.subscribe("locations", Session.get('center'));
});

// server
Meteor.publish('locations', function (centerPoint) {
  // sanitize the input
  check(centerPoint, { lat: Number, lng: Number });
  // return a limited number of documents, relevant to our app
  return Locations.find({ $near: centerPoint, $maxDistance: 500 }, { limit: 50 });
});
Your clients would ask only for some subset of the data at a time, i.e. you don't need the entire collection most of the time; usually you need some specific subset, and you can ask the server to keep you up to date only for that particular subset. Bear in mind that the more different "publish requests" your clients make, the more work there is for your server to do, but that's how it is usually done (this is the simplified version).
Notice how we subscribe in a Deps.autorun block which will resubscribe depending on the center Session variable (which is reactive). So your client can just check out a different subset of data by changing this variable.
When it doesn't make sense to ship your entire collection to the client, you can use methods to retrieve data from the server.
In your case, you can call the getGeoData function when the form is filled out and then display the results after the method returns. Try taking the following steps:
Clearly divide your client and server code into their respective client and server directories if you haven't already.
Remove the geodata subscription on the server (only clients can activate subscriptions).
Remove the geodata publication on the server (assuming this isn't needed anymore).
Define the getGeoData method only on the server. It should return an object, not a cursor, so use findOne instead of find (see the sketch below).
In your form's submit event, do something like:
Meteor.call('getGeoData', address, function (err, geoData) {
  Session.set('geoDataResult', geoData);
});
You can then display the geoDataResult data in your template.
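A minimal sketch of the server-only method from those steps (assuming it lives in a file under the server/ directory):

// server/geodata-methods.js -- defined only on the server.
Meteor.methods({
  getGeoData: function (address) {
    check(address, String);
    // Return a plain document (not a cursor).
    return Geodata.findOne({ address: address });
  }
});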

Can't get list of issues with multiple instances of the same parameter

I'm trying to get a list of issues from Bitbucket via the REST API with OAuth.js (http://oauth.googlecode.com/svn/code/javascript/). I'm signing every request with
OAuth.completeRequest(message, accessor);
where message is
var message = {
  action: "https://api.bitbucket.org/1.0/repositories/owner/reponame/issues",
  method: "GET",
  parameters: p
};
When p contains parameters with different names, everything is OK:
p = [['status','open'],['priority','high']]
but when p contains parameters with the same name
p = [['status','open'],['status','resolved']]
the server responds with 401 UNAUTHORIZED.
The Bitbucket API supports multiple instances of the same parameter:
You can query for multiple instances of the same parameter. The system treats multiple instances of the same parameter as an OR for the overall filter query. For example, the following filter looks for open and resolved bugs with the word for in the title:
status=open&kind=!bug&status=resolved&title=~for
I think the problem is somewhere in the signing methods of the OAuth.js library, but I can't find it.
It turned out to be a bug on Bitbucket's side:
https://bitbucket.org/site/master/issue/7009/you-cannot-use-multiple-identical-query
