Jsplumb add connection programmatically using endpoints - javascript

My requirement is this: I am adding two endpoints with jsPlumb.addEndpoint to two containers named 'container0' and 'container1'.
Now I need to link the two endpoints with a connector programmatically, but jsPlumb.connect creates a new endpoint and connects that instead of using the endpoints I created with jsPlumb.addEndpoint.
How can I connect these two endpoints? Also, I only want to add a connection if one does not already exist between the endpoints.

To connect already existing endpoints, you can make use of the endpoints' UUIDs:
jsPlumb.ready(function () {
    var e0 = jsPlumb.addEndpoint("container0", { uuid: "ep1" }), // set your own uuid for the endpoint for later access
        e1 = jsPlumb.addEndpoint("container1", { uuid: "ep2" });
    jsPlumb.connect({ uuids: [e0.getUuid(), e1.getUuid()] }); // or: jsPlumb.connect({ uuids: ["ep1", "ep2"] });
});
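As for only adding a connection when one doesn't already exist: jsPlumb won't de-duplicate for you, but each Endpoint exposes a `connections` array you can check first. A minimal sketch, assuming that API; the helper name `connectIfAbsent` is my own, not part of jsPlumb:

```javascript
// Sketch: connect two endpoints only if no connection already links them.
// `endpoint.connections` is the array jsPlumb maintains on each Endpoint.
function connectIfAbsent(jsPlumbInstance, sourceEp, targetEp) {
    var alreadyLinked = sourceEp.connections.some(function (c) {
        return c.endpoints.indexOf(targetEp) !== -1;
    });
    if (!alreadyLinked) {
        jsPlumbInstance.connect({ uuids: [sourceEp.getUuid(), targetEp.getUuid()] });
    }
    return !alreadyLinked; // true if a new connection was made
}
```

Calling connectIfAbsent(jsPlumb, e0, e1) twice would therefore create only one connection.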

According to the API, jsPlumb.connect() can receive an
array of UUIDs of the two Endpoints
but that's not the only way to connect endpoints. There is another way (more common, since it's also used to connect two elements): the source and target parameters, which accept a String (an element id), a DOM element, or directly an Endpoint (which is not the same as a DOM selector of an endpoint; it's the object returned by, for instance, let firstEndpoint = jsPlumb.addEndpoint()).
So if, like me, you don't want to use universally unique identifiers, you can stick to the classical source/target form and pass the endpoints as parameters:
jsPlumb.connect({ source: firstEndpoint, target: secondEndpoint }); // both returned by jsPlumb.addEndpoint()

Multiple endpoints in same query

It works perfectly with a single endpoint.
With apollo-link-rest, I have made a client that looks like this:
const restLink = new RestLink({ uri: "https://example.com/" })
and I export the client with new ApolloClient({...}).
Now to the question
On the same server, https://example.com/, there are multiple endpoints, all with the same fields but different data in each.
The first query, which works, looks like this:
export const GET_PRODUCTS = gql`
  query firstQuery {
    # the path could be "second/feed" and it will work, with different data
    products @rest(type: "data", path: "first/feed") {
      id
      title
    }
  }
`
I want all these different paths combined into one and the same JSON feed, because they all have the same fields, but different data.
Using aliases
You can (it should be possible) use the standard aliasing method to make several similar queries at once - fetching many results that normally arrive under the same shape (node name). This is described here. Note that aliases must be valid GraphQL names, so use e.g. feed1 rather than "1":
{
  feed1: products(...) { ... }
  feed2: products(...) { ... }
  ...
}
Paths can be created using variables.
Results can easily be combined by iterating over the data object. The problem? This only works for a fixed (small) number of endpoints, as queries shouldn't be generated by string manipulation.
Multiple graphql queries
You can create queries in a loop - parametrized - using Promise.all() and apollo-client's client.query(). The results need to be combined into one as well.
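A sketch of that loop-plus-Promise.all() idea; the `client` object is assumed to be an initialized ApolloClient, and the per-path variable and the `products` field are assumptions taken from the question, not a fixed API:

```javascript
// Sketch: run one query per feed path in parallel and merge the results.
// `query` is assumed to be a gql document parametrized by a `path` variable.
function fetchAllFeeds(client, query, paths) {
    return Promise.all(
        paths.map(function (path) {
            return client.query({ query: query, variables: { path: path } });
        })
    ).then(function (results) {
        // flatten each result's products array into one combined list
        return results.reduce(function (all, r) {
            return all.concat(r.data.products);
        }, []);
    });
}
```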
Custom fetch
Using a custom fetch, you can create a query that takes an array of paths. In this case the resolver should use Promise.all() on the parametrized fetch requests. The combined results can be returned as a single node (as required).
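A rough sketch of the fan-out-and-merge part of that idea, with plain functions standing in for the real pieces: `fetchImpl` plays the role of `fetch`, and the `products` node name comes from the question:

```javascript
// Sketch: fetch several REST paths in parallel and merge the results into
// one node, as a custom-fetch resolver would.
function fetchMergedFeeds(fetchImpl, baseUri, paths) {
    return Promise.all(
        paths.map(function (p) {
            return fetchImpl(baseUri + p).then(function (res) { return res.json(); });
        })
    ).then(function (feeds) {
        // concatenate every feed's products into a single array
        return { products: [].concat.apply([], feeds.map(function (f) { return f.products; })) };
    });
}
```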
Drawbacks
All these methods need to make multiple requests. That can be avoided by adding a server-side REST wrapper (docs or blog).

Node.js and express Rest api to create custom fields routes

I am working with Node.js and the Express framework for my REST API server.
I have created GET routes with query params and they are working.
But I want functionality like the Facebook Graph API, where I can send fields with my API routes, such as
/me?fields=address,birthday,email,domains.limit(10){id,name,url}
I am thinking of getting the fields from the query parameters and then splitting them based on , such as
const fields = req.query.fields;
let fieldsArray = fields.split(',');
But how can I pass the sub-object attributes and retrieve them from the route, like the domains field in the above example?
If you use dot-notation, like so:
/me?fields=address,domains.id,domains.name&dbQueryFilters=domains.limit=10
it could mean:
I want these fields in my response:
  address
  domains:
    id
    name
Use these to query the db:
  domains:
    limit: 10
You can add/remove such query variables until it explicitly conveys what your API does while still being very basic. It's always in your best interest to keep things simple and basic.
On the Node.js side, you can use a library like flat to unflatten the query object:
var flat = require('flat');
var fields = flat.unflatten(req.query.fields);
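For illustration, the comma-and-dot syntax can also be parsed by hand into a nested field map without any library (the helper name is hypothetical):

```javascript
// Sketch: turn "address,domains.id,domains.name" into a nested field map.
// Similar in spirit to flat's unflatten(), but hand-rolled.
function parseFields(fieldsParam) {
    var tree = {};
    fieldsParam.split(',').forEach(function (field) {
        var node = tree;
        field.split('.').forEach(function (part) {
            node[part] = node[part] || {}; // create the level if missing
            node = node[part];
        });
    });
    return tree;
}
// parseFields("address,domains.id,domains.name")
// → { address: {}, domains: { id: {}, name: {} } }
```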
Try sending the request like:
/me?fields=address,birthday,email,domains.limit%2810%29%7Bid%2Cname%2Curl%7D
Every special character has a percent-encoded form; you can produce it with:
escape("domains.limit(10){id,name,url}") // returns "domains.limit%2810%29%7Bid%2Cname%2Curl%7D"
(Note that escape() is deprecated; encodeURIComponent() is the usual choice for encoding query-string values today.)
More details: JavaScript escape() Function
Hope this solves your issue.

Mirth channelMap in source JavaScript

In my source connector, I'm using JavaScript for my database work because of my requirements and parameters.
The end result is stored like this:
ifxResults = ifxConn.executeCachedQuery(ifxQuery); // var is declared earlier
I need to use these results in the destination transformer.
I have tried channelMap.put("results", ifxResults); and I get the following error: ReferenceError: "channelMap" is not defined.
I have also tried to use return ifxResults but I'm not sure how to access this in the destination transformer.
Do you want to send each row as a separate message through your channel? If so, it sounds like you want to use the Database Reader in JavaScript mode. Just return that ResultSet (it's really a CachedRowSet if you use executeCachedQuery like that) and the channel will handle the rest, dispatching an XML representation of each row as a discrete message.
If you want to send all rows in the result set aggregated into a single message, that will be possible with the Database Reader very soon: MIRTH-2337
Mirth Connect 3.5 will be released next week so you can take advantage of it then. But if you can't wait or don't want to upgrade then you can still do this with a JavaScript Reader:
var processor = new org.apache.commons.dbutils.BasicRowProcessor();
var results = new com.mirth.connect.donkey.util.DonkeyElement('<results/>');
while (ifxResults.next()) {
    var result = results.addChildElement('result');
    for (var entries = processor.toMap(ifxResults).entrySet().iterator(); entries.hasNext();) {
        var entry = entries.next();
        result.addChildElement(entry.getKey(), java.lang.String.valueOf(entry.getValue()));
    }
}
return results.toXml();
I know this question is kind of old, but here's an answer just for the record.
For this answer, I'm assuming that you are using a Source connector type of JavaScript Reader, and that you're trying to use channelMap in the JavaScript Reader Settings editing pane.
The problem is that the channelMap variable isn't available in this part of the channel. It's only available in filters and transformers.
It's possible that what you want can be accomplished by using the globalChannelMap variable, e.g.
globalChannelMap.put("results", ifxResults);
I usually need to do this when I'm processing one record at a time and need to pass some setting to the destination channel. If you do it like I've done in the past, then you would first create a globalChannelMap key/value in the source channel's transformer:
globalChannelMap.put("ProcID", "TestValue");
Then go to the Destinations tab and select your destination channel to make sure you're sending it to the destination (I've never tried this for channels with multiple destinations, so I'm not sure if anything different needs to be done).
(Screenshot: Destination tab of the source channel)
Notice that ProcID is now listed in the Destination Mappings box. Click the New button next to the Map Variable box and you'll see Variable 1 appear. Double click on that and put in your mapping key, which in this case is ProcID.
Now go to your destination channel's source transformer. There you would enter the following code:
var SentValue = sourceMap.get("ProcID");
Now SentValue in your destination transformer has whatever was in ProcID when your source channel relinquished control.

DocumentDB: Access document by database name, collection ID and document ID in Node.js

I'm making my first application using DocumentDB. I'm developing an API for it in Node.js. As others have noted, the DocumentDB APIs are very confusing and appear to require convoluted code to achieve simple things.
My API will allow me to access data in the database with a URL of the form http://<host>/data/<databaseName>/<collectionID>/<documentId>/<pathToData>. If <pathToData> is empty, then I will get the whole document as a JSON object.
I want a function with the signature GetDocument(databaseName, collectionID, documentId, callback), where callback is a function that takes the particular document as a JavaScript object. What implementation of GetDocument achieves my goal?
The DoQmentDB library makes for a trivial solution.
// dbClient: require('documentdb').DocumentClient; new DocumentClient(host, options);
// callback: function (document)
function getDocument(dbClient, databaseId, collectionId, documentId, callback) {
    var DoQmentDB = require('doqmentdb');
    var db = new DoQmentDB(dbClient, databaseId);
    var collection = db.use(collectionId);
    collection.findById(documentId).then(callback);
}
You first need your method to initialize a DocumentClient object with the database and collection parameters, which you can do with the readOrCreateDatabase and readOrCreateCollection helpers showcased in the documentation. Once that object is initialized, you can query specific documents by document id or with a custom query string.
Ideally, you should cache those database and collection objects on the first request, so that you don't hit the db asking for the same information on every single request you issue.
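The caching idea can be sketched independently of the SDK: memoize the promise for each lookup so repeated requests share one round trip. Here `lookup` stands in for whatever SDK call resolves the database or collection; the helper name is hypothetical:

```javascript
// Sketch: cache database/collection lookups so repeated requests reuse the
// first result instead of hitting the db again.
function makeCachedLookup(lookup) {
    var cache = {};
    return function (key) {
        if (!cache[key]) {
            cache[key] = lookup(key); // store the promise itself
        }
        return cache[key];
    };
}
```

Because the promise itself is cached, even requests that arrive before the first lookup finishes reuse the same in-flight call.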

Can't get list of issues with multiple instances of the same parameter

I'm trying to get the list of issues from Bitbucket via the REST API with OAuth.js (http://oauth.googlecode.com/svn/code/javascript/). I'm signing every request with
OAuth.completeRequest(message, accessor);
where message is
var message = {
    action: "https://api.bitbucket.org/1.0/repositories/owner/reponame/issues",
    method: "GET",
    parameters: p
};
When p contains parameters with different names, everything is OK:
p = [['status','open'],['priority','high']]
but when p contains parameters with the same name,
p = [['status','open'],['status','resolved']]
the server responds with 401 UNAUTHORIZED.
The Bitbucket API supports multiple instances of the same parameter:
You can query for multiple instances of the same parameter. The system treats multiple instances of the same parameter as an OR for the overall filter query. For example, the following filter looks for open and resolved bugs with the word for in the title:
status=open&kind=!bug&status=resolved&title=~for
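As an aside, a query string with repeated parameter names, as that documentation describes, can be built with the standard URLSearchParams (available in browsers and in Node):

```javascript
// Repeated parameter names are legal in a query string; URLSearchParams
// keeps every appended instance instead of overwriting.
const params = new URLSearchParams();
params.append("status", "open");
params.append("kind", "!bug");
params.append("status", "resolved");
params.append("title", "~for");
// params.getAll("status") → ["open", "resolved"]
```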
I think the problem is somewhere in the signing methods of the OAuth.js library, but I can't find it.
It turned out to be a bug on the Bitbucket side:
https://bitbucket.org/site/master/issue/7009/you-cannot-use-multiple-identical-query
