OData CRM query: Any open opportunities with a given target? - javascript

Our organization has a CRM installation on which we've done extensive customization. Right now I'm trying to implement a solution to enforce a business rule: prevent users from setting a program to inactive when the program is an opportunity designation on an open opportunity.
I know how to prevent the update: return false from OnSave() in the JavaScript. What I haven't been able to work out is how to detect when that condition holds. The best idea I've come up with is to query the OData endpoint in CRM, but I've hit a sticking point at the last step. (If you've got a better idea I'm totally open to it.)
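(For context, a bare-bones version of that guard, with the check left as a stub; the function names are mine, and the return-false behaviour is taken from the question above:)

// OnSave handler; per the question, returning false here blocks the save.
function onSaveGuard() {
  if (programIsOnOpenOpportunity()) { // stub - working out this check is what the question is about
    alert("This program is a designation on an open opportunity and cannot be deactivated.");
    return false; // cancel the save
  }
  return true;
}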
Here's what I've got. I can get the program in question:
programset(guid'thisone')
.../OrganizationData.svc/uwkc_programSet(guid'F4D75E9D-3A79-E611-80DA-C4346BACAAC0')
I can get the associated designations:
programset(guid'thisone')/program-desig
.../OrganizationData.svc/uwkc_programSet(guid'F0D75E9D-3A79-E611-80DA-C4346BACAAC0')/uwkc_uwkc_program_uwkc_opportunitydesignation
and the associated Opportunities to those:
programset(guid'thisone')/program-desig?$expand=desig-opportunity
...OrganizationData.svc/uwkc_programSet(guid'F0D75E9D-3A79-E611-80DA-C4346BACAAC0')/uwkc_uwkc_program_uwkc_opportunitydesignation?$expand=uwkc_opportunity_uwkc_opportunitydesignation
... but now I get a little stuck.
I can filter on a primitive value on the Opportunity (link + field)
...$filter=opp-oppdesig/EstimatedCloseDate gt DateTime('2016-07-01')
...OrganizationData.svc/uwkc_programSet(guid'F0D75E9D-3A79-E611-80DA-C4346BACAAC0')/uwkc_uwkc_program_uwkc_opportunitydesignation?$expand=uwkc_opportunity_uwkc_opportunitydesignation&$filter=uwkc_opportunity_uwkc_opportunitydesignation/EstimatedCloseDate%20gt%20DateTime%272016-07-01%27
and I can filter on a complex value on the Designation (field + value)
...$filter=statecode/Value gt 0
...OrganizationData.svc/uwkc_programSet(guid'F0D75E9D-3A79-E611-80DA-C4346BACAAC0')/uwkc_uwkc_program_uwkc_opportunitydesignation?$expand=uwkc_opportunity_uwkc_opportunitydesignation&$filter=statecode/Value%20gt%200
but I can't make a filter work on a complex value on the Opportunity (connection + field + value)
...$filter=opp-oppdesig/statecode/Value gt 0
...OrganizationData.svc/uwkc_programSet(guid'F0D75E9D-3A79-E611-80DA-C4346BACAAC0')/uwkc_uwkc_program_uwkc_opportunitydesignation?$expand=uwkc_opportunity_uwkc_opportunitydesignation&$filter=uwkc_opportunity_uwkc_opportunitydesignation/statecode/Value%20gt%200
No property 'statecode' exists in type 'Microsoft.Xrm.Sdk.Entity' at
position 45.
How can I filter on the state of an entity two away from what I'm looking at? Or, if there's a better way, what's the best way to prevent in-use programs from being deactivated?

The first issue is that you should be using the schema name of the attribute (StateCode), not the logical name (statecode).
However, I believe it will then just return another error message:
<error xmlns="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata">
<code>-2147220989</code>
<message xml:lang="en-US">attributeName</message>
</error>
For some reason, it seems that filtering on complex types in an expanded entity does not work properly for the OData endpoint. And from what I have tested, the new Web API does not yet support this kind of depth in a query either.
One solution to your problem is to just fetch all the results, and then perform the filtering manually in your code. This of course works best if you can assume that there are not too many related entities retrieved in this kind of query. Also be sure to use $select to retrieve only the necessary attributes, as it greatly reduces the time it takes for a query to finish.
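For illustration, here is a minimal sketch of that approach as a form-script helper. It assumes the entity and relationship names from the question, the standard /XRMServices/2011/OrganizationData.svc path, and the usual d.results JSON shape with option sets exposed as { Value: n }; the function name itself is made up.

// Hypothetical helper: true if this program is a designation on at least one open opportunity.
function programHasOpenOpportunity(programId) {
  var url = Xrm.Page.context.getClientUrl() +
    "/XRMServices/2011/OrganizationData.svc/uwkc_programSet(guid'" + programId + "')" +
    "/uwkc_uwkc_program_uwkc_opportunitydesignation" +
    "?$expand=uwkc_opportunity_uwkc_opportunitydesignation" +
    "&$select=uwkc_opportunity_uwkc_opportunitydesignation/StateCode";
  var req = new XMLHttpRequest();
  req.open("GET", url, false); // synchronous so the OnSave handler can block the save
  req.setRequestHeader("Accept", "application/json");
  req.send();
  var designations = JSON.parse(req.responseText).d.results;
  // Filter in code instead of in the URL: open opportunities have StateCode 0.
  return designations.some(function (desig) {
    var opp = desig.uwkc_opportunity_uwkc_opportunitydesignation;
    return opp && opp.StateCode && opp.StateCode.Value === 0;
  });
}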
Another solution is to perform the query using FetchXML instead. This can be done either via the Web API, or as a SOAP request you construct yourself.
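A rough sketch of what that FetchXML might look like; the lookup attribute names uwkc_program and uwkc_opportunity are guesses based on the relationship names in the question, so check the actual schema before using them:

// Designations of this program whose related opportunity is still open (statecode 0).
var fetchXml =
  "<fetch>" +
  "  <entity name='uwkc_opportunitydesignation'>" +
  "    <filter><condition attribute='uwkc_program' operator='eq' value='" + programId + "'/></filter>" +
  "    <link-entity name='opportunity' from='opportunityid' to='uwkc_opportunity'>" +
  "      <filter><condition attribute='statecode' operator='eq' value='0'/></filter>" +
  "    </link-entity>" +
  "  </entity>" +
  "</fetch>";
// Any row coming back means the program is in use and the save should be blocked.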
A third solution is to split your query into multiple queries, so you don't have to filter on a state two entities away in a query.

Related

How to ensure data consistency in REST API pagination

I'm building an instant messenger on a mobile client that interacts with a RESTful API through HTTP requests. The pagination endpoint is quite standard - it takes a starting location (offset) and a number of items per page (limit). I'm having trouble figuring out how to ensure 100% data consistency with pagination when the database can change rapidly.
For example, with a dozen or so participants, there could be a dozen new messages in a conversation within a second. It's not far-fetched to assume that some of those messages will alter the database in the time it takes the pagination request to come back from the server. Fortunately, since this is a messenger, I don't have to consider data deletion, only data addition.
In my research, the following two links were quite helpful but didn't provide a clear solution:
How to ensure data integrity in paginated REST API?
How to implement robust pagination with a RESTful API when the resultset can change?
The only potential solution I can come up with is using the timestamp of the last object in the previously fetched page. So the HTTP query would take the timestamp as a parameter, and the server would return a page of objects created after that timestamp.
Is there any potential problem I'm not seeing, or even better, a much better solution to this issue?
It seems that the method I've thought of has a name - cursor-based pagination.
The link below has a great graphical description and explanation, plus an example in php.
http://www.sitepoint.com/paginating-real-time-data-cursor-based-pagination/
There's also a helpful guide from Django REST Framework that compares two different pagination techniques (LimitOffsetPagination and CursorPagination).
http://www.django-rest-framework.org/api-guide/pagination/
Cursor-based pagination requires a unique, unchanging ordering of items. Facebook and Twitter use generated IDs for this. As for me, I've decided to simply use the object-creation timestamp, since it gives millisecond precision. That should be good enough for now.
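A minimal sketch of the idea in plain JavaScript; the message shape, field names, and helper function are illustrative assumptions, and in practice the filter/sort/limit would be the database query behind the endpoint:

// Cursor-based pagination over an already-fetched array, purely to illustrate the idea.
function getMessagesAfter(allMessages, cursorTimestamp, limit) {
  var page = allMessages
    .filter(function (m) { return m.createdAt > cursorTimestamp; }) // only items newer than the cursor
    .sort(function (a, b) { return a.createdAt - b.createdAt; })    // unique, unchanging order
    .slice(0, limit);
  return {
    items: page,
    // The client sends nextCursor back with its next request instead of an offset.
    nextCursor: page.length ? page[page.length - 1].createdAt : cursorTimestamp
  };
}
// Example: first call with cursor 0, then keep passing back nextCursor on each request.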

Run Database Stored RegEx against DOM

I have a question about how to approach a certain scenario before I get halfway through it and figure out it was not the best option.
I work for a large company with a team that creates tools for teammates to use that aren't official enterprise tools. We have no direct access to the database, just access to an internal server to store our files on, and we can run JavaScript etc. against the main site (same domain).
What I am working on is a tool with a ton of options that let you select what I will call "data points" on a page.
These are things like account status, balance, name, phone number, email, etc., and the tool saves them to an Excel sheet.
So you input account numbers, choose what you need, and then, using IE objects, it navigates to the page and scrapes the data you request.
My question is as follows:
I want to make the scraping part fairly dynamic in the way it works, so that I can add new data points on the fly.
My goal is to store the regular expression needed to get each specific piece of data in the table alongside the "data point" option.
If I choose "Name", it looks up the expression for name in the database and runs it against the DOM.
What would be the best way about creating that type of function in Javascript / Jquery?
I need to pass a Regex to a function, have it run against the DOM and then return the result.
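Something like this minimal sketch might be the core of it; the function name is mine, and matching against document.body.innerHTML is just one choice (textContent may be cleaner if the markup is noise):

// Run a stored regular expression (kept as a string in the database) against the page.
function runDataPointRegex(patternSource, patternFlags) {
  var regex = new RegExp(patternSource, patternFlags || "");
  var match = regex.exec(document.body.innerHTML);
  // Return the first capture group if the pattern has one, otherwise the whole match (or null).
  return match ? (match[1] !== undefined ? match[1] : match[0]) : null;
}
// e.g. runDataPointRegex("Account status:\\s*(\\w+)") could pull the status text for the "Account status" data point.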
I have a feeling that there will be things that require more than 1 step to get the information etc.
I am just trying to think of the best way to approach it without having to hardcode 200+ expressions into the file as the page may get updated and need to be changed.
Any ideas?
IRobotSoft scraper may be the tool you are looking for. Check this forum and see if the questions there are similar to what you are doing: http://irobotsoft.org/bb/YaBB.pl?board=newcomer. It is free.
Its main language is not regular expressions but HTQL, which may be more suitable for extracting data from web pages. It also supports regular expressions, just not as the main language.
It organizes all your actions well with a visual interface, so you can dynamically compose actions or tasks for changing needs.

How to make SELECT query work when the table schema is upgraded with a new column?

I'm working on a JavaScript project which has an existing database (and hence an existing schema).
To add a new feature I must add a new column to the table. This will store a property of objects of a class.
Currently there is a SELECT query in the project which queries a bunch of fields from the database table every time the application starts and then puts these obtained results into the various needed JavaScript objects.
So, now it looks like:
let FIELDS = "field1, field2, field3";
let query = "SELECT " + FIELDS + "FROM FOO_TABLE";
Somewhere else, this query is made whenever needed (usually after app restart).
I thought of changing it to:
let FIELDS = "field1, field2, field3, new_prop";
But that won't work, because such a column doesn't exist in the current table. (Maybe after the next restart things will work, but not the first time.)
What is the workaround?
Also, please note that a silent change would be better than one that makes it obvious to everyone who works on the file that this property is new.
Well, you can't just add the field to the query (that won't work), so you have two options.
Use exceptions/errors to work out when something goes wrong. I'm not an SQLite guru (in fact I've never used SQLite directly, always via APIs, as with iOS dev), but if you try to run a query containing a column name that doesn't exist, something will error; you can capture that, work out the error code/string, and decide to re-run the query minus the bad column.
This is not a good idea - it's dirty and expensive. I've used exceptions to save dupe-check queries in the past, and even that isn't the correct way to do things... but it does work.
The only "proper" way to do what you want is to first check if the column exists. In sqlite, i don't think there's a simply select command like there is in mysql. But you can use the PRAGMA table_info('table-name') statement to get all columns, then check if your column exists prioir to running the query. This is the correct way to do things.
Having said that, if you're working in a collaborative/team dev environment, you should have a much cleaner way of upgrading everyone in the group, so I'd be more tempted to address that issue of procedure rather than code my way around it.

Running a script/query on all documents in a couchdb database/view

I recently discovered CouchDB and it fits perfectly with what I am working on today. Working with the Futon interface and calling the HTTP API works fine, but something is missing.
During the design of my application, I sometimes want to apply a change to all the documents in the database. As a simplified example, let's say all my documents have a field named "type", for which I originally picked strings instead of numbers.
Now I have to go over all my documents in Futon, and change the string into a number, which is a silly job.
Another example would be deleting all documents that apply to a certain condition.
The perfect solution would be some kind of engine that calls a JavaScript function per document and lets me return the new value for the document.
Does this exist?
Everything you need is already there: the API. Note that Futon is also only a wrapper around the API.
Not sure what middleware you use (Node? PHP?), but if you're familiar with the API it should be easy to:
Get all documents
Change field "type" (string instead of number)
Save the docs (use _bulk_docs)
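As a rough sketch of those three steps using plain fetch (browser or Node 18+) against the HTTP API; the database name "mydb" and the string-to-number conversion are assumptions for illustration:

// 1. Get all documents, 2. fix the "type" field, 3. save them back in one _bulk_docs call.
async function migrateTypes(couchUrl) {
  const res = await fetch(couchUrl + "/mydb/_all_docs?include_docs=true");
  const body = await res.json();
  const docs = body.rows
    .map(row => row.doc)
    .filter(doc => typeof doc.type === "string")
    .map(doc => ({ ...doc, type: Number(doc.type) })); // keep _id and _rev so the update is accepted
  await fetch(couchUrl + "/mydb/_bulk_docs", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ docs: docs })
  });
}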

Best Practices for displaying large lists

Are there any best practices for returning large lists of orders to users?
Let me try to outline the problem we are trying to solve. We have a list of customers that have 1-5,000+ orders associated with each. We pull these orders directly from the database and present them to the user in a paginated grid. The view we have is a very simple "select columns from orders", which worked fine when we were first starting, but as we are growing it's causing performance/contention problems. Seems like there are a million and one ways to skin this cat (return only a page's worth of data, only return the last 6 months of data, etc.), but like I said, I'm just wondering if there are any resources out there that provide a little more hand-holding on how to solve this problem.
We use SQL Server as our transaction database and select the data out in XML format. We then use a mixture of XSLT and Javascript to create our grid. We aren't married to the presentation solution but are married to the database solution.
My experience.
Always set default values in the UI for the user that are reasonable. You don't want them clicking "Retrieve" and getting everything.
Set a limit to the number of records that can be returned.
Only return from the database the records you are going to display.
If forward/backward consistency is important, store the entire result set from the query in a temp table and return just the page you need to display. When paging up/down, retrieve the next set from the temp table.
Make sure your indexes are covering your queries.
Use different queries for different purposes. Think "Open Orders" vs "Closed Orders". These might perform much better as separate queries instead of one generic query.
Set parameter defaults in the stored procedures. Protect your query from a UI that is not setting reasonable limits.
I wish we did all these things.
I'd recommend doing some profiling to find the actual bottlenecks. Perhaps you have access to Visual Studio Profiler? http://msdn.microsoft.com/en-us/magazine/cc337887.aspx There are plenty of good profilers out there.
Otherwise, my first stop would be pagination, to bring back fewer records from the db, which is easier on the connection and the memory footprint. Take a look at this (I'm assuming you're on SQL Server >= 2005):
http://www.15seconds.com/issue/070628.htm
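For reference, the usual SQL Server 2005+ pattern for this is a ROW_NUMBER() window over the sort order; a rough sketch, with the table/column names and parameters made up for illustration:

// Page N of orders for one customer; only the requested rows leave the database.
var pageSql =
  "SELECT * FROM (" +
  "  SELECT o.*, ROW_NUMBER() OVER (ORDER BY o.OrderDate DESC) AS RowNum" +
  "  FROM Orders o WHERE o.CustomerId = @CustomerId" +
  ") paged " +
  "WHERE paged.RowNum BETWEEN (@Page - 1) * @PageSize + 1 AND @Page * @PageSize";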
I"m not sure from the question exactly what UI problem you are trying to solve.
If it's that the customer can't work with a table that is just one big amorphous blob, then let him sort on the fields: order date, order number, your SKU number, his SKU number maybe, and I guess others,too. He might find it handy to do a multi-column stable sort, too.
If it's that the table headers scroll up and disappears when he scrolls down through his orders, that's more difficult. Read the SO discussion to see if the method there gives a solution you can use.
There is also a JQuery mechanism for keeping the header within the viewport.
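One common jQuery trick for that, sketched here with made-up element IDs and class names: pin the header once the page has scrolled past the top of the table.

// Add a "stuck" class to the table header once it would scroll out of view.
$(window).on("scroll", function () {
  var tableTop = $("#orders-table").offset().top;
  $("#orders-table thead").toggleClass("stuck", $(window).scrollTop() > tableTop);
});
// The .stuck rule would then use position: fixed (or position: sticky in modern CSS).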
HTH
EDIT: Plus, I'll second #Iain's answer: do some profiling.
Another EDIT: #Scott Bruns's answer reminded me that when we started designing the UI, the biggest issue by far was limiting the number of records the user had to look at. So yes, I agree with Scott that you should give the user some way to see only a limited number of records right from the start; that is, before he ever sees a table, he has told you a lot about what he wants to see.
Stupid question, but have you asked the users of your application for input on what records that they would like to see initially?
