Updating Silverlight with data: JSON or WCF?

We will be using custom Silverlight 4.0 controls on our ASP.NET MVC web page to display data from our database, and we were wondering what the most efficient method would be. We will have result sets of up to 100k records (with 2 properties per record).
We have a test that uses the HTML Bridge from JavaScript to Silverlight. First we perform a POST request to a controller action in the MVC web app and return JSON. This JSON is then passed to Silverlight, where it is parsed and the UI updated. This seems to be rather slow: the stored procedure (the select) takes about 3 seconds, and the entire update in the browser about 10-15 seconds.
Having had a brief look on the net, it seems that WCF is another option, but not having used it, I wasn't sure of its capability or suitability.
Does anyone have any experiences or recommendations?

You should definitely consider a change in your approach. This just shouldn't have to be so complicated. WCF is a possible solution, and I am sure you are going to get better performance out of it.
It is designed to transfer data across the wire. Web services in general are considered the "right way" to provide data to your Silverlight app, and WCF services are definitely more configurable.
Another point in favour of web services is that this approach is more straightforward than the one you are using: you don't have to serialize to JSON, parse it into JavaScript objects, and then pass them to Silverlight.
It is really easy to port and continue developing with WCF.
Last but not least, your code will be much more readable and maintainable.
It seems that performance is critical in your case, so you can take a look here for a comparison.
In conclusion, my advice is to consider a change in your approach. WCF services look like a possible solution.
Hope this helps.
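For reference, while the bridge-based flow from the question is still in place, one mitigation is to push the result set across the HTML Bridge in chunks instead of one ~100k-record string. This is only a sketch: the plug-in id `slPlugin`, the scriptable object name `Bridge`, and its `LoadChunk` method are hypothetical names, not a documented API.

```javascript
// Pure helper: split a large array of records into fixed-size chunks,
// so no single bridge call has to marshal the whole result set.
function chunkRecords(records, chunkSize) {
  var chunks = [];
  for (var i = 0; i < records.length; i += chunkSize) {
    chunks.push(records.slice(i, i + chunkSize));
  }
  return chunks;
}

// Browser-only wiring (hypothetical names: assumes a plug-in with id
// "slPlugin" that registered a scriptable object "Bridge" exposing a
// LoadChunk(jsonString) method on the Silverlight side).
function pushToSilverlight(records) {
  var bridge = document.getElementById("slPlugin").Content.Bridge;
  chunkRecords(records, 5000).forEach(function (chunk) {
    bridge.LoadChunk(JSON.stringify(chunk));
  });
}
```

Chunking does not reduce the total work, but it avoids one giant string marshal and lets the Silverlight UI update incrementally.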

Related

How to consume a HATEOAS REST API in Angular?

I'm working on an Angular 4 front-end for an API built by another team. The API follows HATEOAS and provides me with hypermedia links with every single response.
I know the shape of the API and I figure I can just hard-code the URLs into Angular Services with minimal fuss. However, a colleague (who is a backend developer) is trying to convince me that I should take full advantage of the hypermedia because it will mean less coupling between the frontend and backend (and potential breakage if the API changes).
However, I'm stumped on how I'd even go about implementing a simple HATEOAS pattern using Angular's built-in Http service. How would I store/share the hypermedia/URL information in a way that doesn't couple all the services together and make them hard-to-test? There seems to be no examples out there.
Would trying to create a HATEOAS-friendly HTTP client even be a good idea, or is it likely not worth the trouble?
Your colleague is right, you should use the meta information that the back-end provides. In this way you are not putting responsibility on the client that doesn't belong there. Why should the client know from where to fetch the entities? Storing the entities (in fact the data in general) is the responsibility of the back-end. The back-end owns the data, it decides where to put it, how to access it, when to change the location or the persistence type, anything related to storing the data.
How would I store/share the hypermedia/URL information in a way that doesn't couple all the services together and make them hard-to-test?
Why do you think using HATEOAS makes testing harder? It does not; in fact, not using it makes testing harder, because hard-coded static URLs make the back-end difficult to stub out.
You can extract the information from the back-end response and store it as meta-information in the Angular model, under a _meta key or something like that.
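A minimal sketch of that idea, assuming a HAL-style response with an `_links` object (the exact link format depends on the API): separate the payload from its hypermedia, so components only see the data while services resolve URLs through the stored `_meta` links.

```javascript
// Split a HAL-style response into plain data plus a _meta block that
// maps each relation name (self, next, ...) to its href.
function toModel(response) {
  var links = response._links || {};
  var data = {};
  for (var key in response) {
    if (response.hasOwnProperty(key) && key !== "_links") {
      data[key] = response[key];
    }
  }
  return {
    data: data,
    _meta: {
      links: Object.keys(links).reduce(function (acc, rel) {
        acc[rel] = links[rel].href;
        return acc;
      }, {})
    }
  };
}

// A service then follows a relation instead of a hard-coded URL:
function hrefFor(model, rel) {
  return model._meta.links[rel]; // undefined if the API stops providing it
}
```

Because services only ask for a relation name, only this mapping layer needs a stub in tests, not every URL.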

Modifying an existing duplex WCF service to use with JavaScript

I've got an existing WCF service that we've been using to communicate with a Silverlight client, and hence have been using it with the NetTCP binding. I'd like to start using this same service with a JavaScript client, ideally modifying the service as little as possible (i.e., allowing Silverlight and JS clients to call the same duplex service). Ideally this would happen through a reasonably performant and scalable tech, like WebSockets, rather than a hack like Comet.
What's the best way to do this?
Adding WebSocket support to the service (through the NetHttpBinding) would seem like one obvious way to do it - but there doesn't seem to be any documentation on how to call the resulting service from JavaScript. I suppose I could configure it to use a text-based transport, instead of the default binary transport, and then hack together some sort of JavaScript-based SOAP client (perhaps using WSDL2JS) to call it. That feels like it ought to work, but also pretty awkward, and with some pieces in the mix that haven't been well documented.
I could also re-implement my service in a framework like XSockets or SuperWebSocket, but that's some real work, and keeping it in sync with the WCF implementation would be more on top of that.
Any other thoughts?
I am one of the guys behind XSockets.NET; I hope I can help you with this one.
XSockets has an "external" API that you can use from WCF (or anything else talking TCP/IP and .NET) to send messages to the XSockets server. The server then passes the messages (pub/sub pattern) to the client(s), and vice versa.
So, there will be almost no changes to your WCF.
Just tell me if you need an example, and I will provide one for you. Just send me an email on uffe at xsockets dot net and we can take it from there.
EDIT: Created an example on how to boost your WCF to real time. It's on GitHub: Boost WCF to RealTime
Regards
Uffe, Team XSockets
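Whichever server side is chosen (XSockets, or WCF with the NetHttpBinding), the browser end of such a setup is the standard WebSocket API. A minimal sketch; the endpoint URL, the `{topic, payload}` message shape, and the topic names are illustrative assumptions, not the XSockets wire format.

```javascript
// Minimal pub/sub-style routing for messages arriving over a socket.
function createRouter() {
  var handlers = {};
  return {
    on: function (topic, fn) { handlers[topic] = fn; },
    dispatch: function (raw) {
      var msg = JSON.parse(raw);
      if (handlers[msg.topic]) handlers[msg.topic](msg.payload);
    }
  };
}

// Browser wiring (standard WebSocket API, hypothetical endpoint):
function connect(url, router) {
  var ws = new WebSocket(url); // e.g. "ws://example.com/realtime"
  ws.onmessage = function (e) { router.dispatch(e.data); };
  return ws;
}
```

Keeping the routing separate from the socket makes the message handling testable without a live connection.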

Why should I use backbone.js or spine.js?

I'm developing a mostly informational public facing website. My architecture is to deliver JSON data to the client for pages in the site. I plan on caching the JSON in localStorage on client and let it persist there for XX amount of time before it refreshes. I'm using client side templates (jsRender) for rendering JSON into UI widgets that are then pushed into view using jQuery.
In my research for this, I stumbled upon JavaScript MVC approaches like backbone.js and spine.js, among others. I've read through them and compared them to my approach above, and I am not sure if/why I would need something like backbone.js or spine.js. I'm hardly doing any data entry, except having users fill out a contact-us form or sign up for our newsletter, so there is really no need to keep view and model in sync. I'm just retrieving JSON from my server, rendering it using templates, and caching the JSON for a period of time in localStorage.
I want to check with the experts out there if my approach seems appropriate and to see if I really "need" backbone.js or spine.js. How would any of these approaches help with my architectural direction?
If you feel you don't need anything else, I would suggest not using it. "Premature optimization is the root of all evil." When you run into trouble because your application becomes messy and you spend a lot of time implementing new features or fixing bugs, all this stuff will start to make sense to you. Then you will learn why it's very convenient and elegant to implement MVC in your app from the very beginning.
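The caching scheme described in the question (JSON kept in localStorage for some amount of time, then refreshed) can be sketched as below. The store is injectable so the logic also runs outside the browser; pass `window.localStorage` for real use. The key names, TTL, and the synchronous `fetchFresh` callback are illustrative, since a real fetch would be an async `$.ajax` call.

```javascript
// Return cached data if it is younger than ttlMs, otherwise fetch
// fresh data, cache it with a timestamp, and return it.
function getCached(store, key, ttlMs, fetchFresh, now) {
  now = now || Date.now();
  var raw = store.getItem(key);
  if (raw) {
    var entry = JSON.parse(raw);
    if (now - entry.savedAt < ttlMs) return entry.data; // still fresh
  }
  var data = fetchFresh(); // illustrative: stands in for an AJAX call
  store.setItem(key, JSON.stringify({ savedAt: now, data: data }));
  return data;
}
```

For this page-rendering architecture, a helper like this plus templates is arguably all that's needed; a full MVC framework only pays off once views and models must stay in sync.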

Do I have to use a Backend when using Backbone.js?

I want to develop a relatively simple application that calculates some value based on several inputs. I don't want a backend; all the calculation can be done in the browser.
I'm a little new to JavaScript and web apps, and I came across Backbone.js.
I really like the MVC design, however, they mention a backend a lot. My question:
Is a backend server absolutely required?
Is a backend server optional, but without one there isn't much point in Backbone?
Or will Backbone really help me out?
Backend is not required.
Backbone can fully work without any backend if your application doesn't require one.
That depends on your application. If you want to retrieve the value of some inputs and calculate a result, then Backbone won't do that for you; it will help you structure your code. If your app is simple and doesn't need support for models, views, collections, or routing, then there is no point in using Backbone. It's hard to answer this question in general.
For example, the classic Todo example application doesn't use any backend.
Backbone.js implements fetch(), save(), destroy() etc. methods on models automatically performing appropriate AJAX requests and parsing response. So it has a strong support for backend via REST services, but it is optional.
You can still use models, views, routers and events without any server-side code. Just don't call REST methods (or override them at your wish).
You can use localStorage for persistence (you'd have to implement this yourself or find it on the web, like here) but if you don't even need that then you don't need to use any of the persistence methods in backbone.
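A minimal sketch of that idea, assuming you replace `Backbone.sync` wholesale (for real use you would assign `Backbone.sync = makeLocalSync(window.localStorage)`). The storage is injectable and the `"model-<id>"` key scheme is illustrative; existing adapters like Backbone.localStorage cover this more completely.

```javascript
// Build a drop-in replacement for Backbone.sync that persists models
// to a localStorage-like store instead of issuing REST calls.
// Backbone calls sync(method, model, options) with method one of
// "create", "read", "update", "delete".
function makeLocalSync(store) {
  return function sync(method, model, options) {
    var key = "model-" + (model.id || "new");
    var result;
    if (method === "create" || method === "update") {
      store.setItem(key, JSON.stringify(model.toJSON()));
      result = model.toJSON();
    } else if (method === "read") {
      result = JSON.parse(store.getItem(key) || "null");
    } else if (method === "delete") {
      store.removeItem(key);
      result = null;
    }
    if (options && options.success) options.success(result);
    return result;
  };
}
```

With this in place, `model.save()` and `model.fetch()` keep working, only against the browser's storage instead of a server.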
Backbone is meant to help you structure a medium-to-large application (js-wise) so that it doesn't become unmaintainable jQuery spaghetti. For small applications (js-wise) it's really overkill, unless you are trying to learn how Backbone works.
Note that by js-wise I mean the client-side code: if you had a huge backend but the only JS were something that focuses a form, it would not even count as a small application (js-wise).
You can use backbone.js without a backend. However you obviously won't be able to store or retrieve data. Backbone may still be useful for keeping your code organized, however it really shines when you want to separate presentation logic from logic that manipulates your data, which is a goal of the MVC pattern. Generally your data will be stored on and retrieved from a backend.
If you want to play around with data persistence, try out backlift.com. [disclosure, I work on backlift.com] We've tried to make it easy to get a backbone app up-and-running without having to setup a server or deal with compiling templates.

Which one is better for jQuery.ajax calls? .Net Web-Service or an .ashx?

I have been practicing jQuery.ajax() recently, and I have started to learn to call .NET web services with jQuery.ajax().
Now I am wondering: if I will only use jQuery.ajax() calls to some service methods on the server, is it still meaningful to have .NET web services, or should I go with .ashx handlers instead?
Thanks!
Two quotes from the ASP.NET forums:
Unless it's an extremely high load situation, you'll find that all
three perform nearly identically. The performance of your code inside
the handler/service is going to be the limiting factor.
For simple AJAX calls that are only intended to be exposed to the
browser, I don't think WCF justifies its added complexity. It's great
for some things, but I have a hard time recommending it for this.
Between ASMX and HttpHandler, I go with ASMX every time. An
HttpHandler is probably negligibly faster, but an ASMX "ScriptService"
makes JSON serialization and deserialization of complex types
transparent, which is immensely useful.
Here's another option:
If you have some methods you want to run (and you like JQuery)... I
suggest looking at this:
http://encosia.com/2008/05/29/using-jquery-to-directly-call-aspnet-ajax-page-methods/
and related articles. Works beautifully, and is very efficient as far as
bandwidth goes. They also have an article on querying .asmx services.
There is no messing around with ASP.NET's innate AJAX, which out of the
box can be very bloated. Plus it's very easy.
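The ASMX "ScriptService" call pattern those quotes describe is mostly a matter of getting the request headers and body right. A sketch; the service URL and method name are illustrative.

```javascript
// Build the $.ajax settings object for calling an ASMX ScriptService
// (or an ASP.NET page method): POST, JSON body, JSON content type.
function buildScriptServiceRequest(url, args) {
  return {
    type: "POST",
    url: url,
    data: JSON.stringify(args),
    contentType: "application/json; charset=utf-8",
    dataType: "json"
  };
}

// ASP.NET AJAX wraps the actual result in a ".d" property:
function unwrap(response) {
  return response && response.hasOwnProperty("d") ? response.d : response;
}

// Usage in the browser (assumes jQuery and a GetCustomers web method):
// $.ajax(buildScriptServiceRequest("Customers.asmx/GetCustomers", { top: 10 }))
//   .done(function (resp) { render(unwrap(resp)); });
```

Forgetting the `contentType` or sending form-encoded data is the classic mistake here: the service then returns XML instead of JSON.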
You should also have a look at Microsoft's most recent framework for web services: Web API.
Personally, I recently switched to ServiceStack.NET for my web services, and I find it's a lot easier and elegant (than WebAPI, or WCF).
A WCF service would be the preferred method over ASP.NET SOAP web services or .ashx handlers.
