Ember 2: extract model data - javascript

Is there any way to extract model data from an existing Ember app (Ember version >= 2.10) without making any changes to the app's source?
For example, I want to run Selenium tests against my Ember-based UI, and some of my initialization code depends on Ember models. Can I extract these models via some JS script?

You cannot access the store from outside its namespace. In other words, if you don't have access to an Ember container, you will not be able to look up the store.
You would have to modify the source code and do something hacky like setting the main App store as a global property (not recommended, as it can lead to memory leaks) and accessing that global store from your test suite.
Recommended: rely on Ember's well-thought-out acceptance tests:
https://guides.emberjs.com/v2.11.0/testing/acceptance/
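For reference, here is a minimal acceptance-test sketch in the style of that guide; the app name my-app, the /posts route, and the .post-title selector are placeholders, not part of the original question:
import { test } from 'qunit';
import moduleForAcceptance from 'my-app/tests/helpers/module-for-acceptance';

moduleForAcceptance('Acceptance | posts');

test('visiting /posts renders the loaded models', function(assert) {
  visit('/posts');
  andThen(function() {
    assert.equal(currentURL(), '/posts');
    // assert against the rendered output instead of reaching into the store
    assert.ok(find('.post-title').length > 0);
  });
});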
If you did have access to the App instance, you could simply do:
var store = App.__container__.lookup('service:store'); // 'store:main' only exists in pre-2.0 Ember Data
var post = store.peekRecord('post', 1); // => no network request
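If you do go down the global-property route described above, a rough sketch might look like the following; the initializer name, window.__store, and the selenium-webdriver usage are assumptions for illustration only:
// app/instance-initializers/expose-store.js (include in test builds only)
export function initialize(appInstance) {
  // deliberately leak the store onto window so an external test can reach it
  window.__store = appInstance.lookup('service:store');
}
export default { name: 'expose-store', initialize };

// In a Node-based Selenium test (inside an async test function, with `driver` already created),
// map records to plain values before returning them, since executeScript
// can only hand back JSON-serializable data.
const titles = await driver.executeScript(
  'return window.__store.peekAll("post").map(function(p) { return p.get("title"); });'
);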

Related

How to share a single instance of an object among multiple actions?

I have a React/Redux application that talks a lot to an API and deals with a lot of rarely changing data from a DB. In order to reduce traffic and improve the user experience, I now want to create a caching mechanism that stores data on the client by automatically using the best technology that is available (falling back from IndexedDB to localStorage, etc.).
I created a cache object that does an initial check to determine the storage mechanism (which gets saved to an engine property, so the check only needs to run once). It also has some basic methods, save(key, value) and load(key), which then call the appropriate functions for the initially determined mechanism.
The cache object and its methods work, but I wonder how to create the cache in my main index.js only once when the application loads, and then use that same object in my actions without recreating another cache object every time.
BTW: It feels wrong to make the cache part of my application state as it does not really contain substantial data to run the application (if there is no caching available, it falls back to just calling the API).
Do I need to inject the cache into my actions somehow? Or do I need to create a global/static cache object in the main window object?
Thanks for clarification and thoughts on this issue.
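For context, a minimal sketch of the kind of cache module described above, simplified to a localStorage check with an in-memory fallback (the detection logic and names are illustrative, not the actual implementation):
// cache.js - evaluated once on first import, so every importer shares this object
function detectEngine() {
  try {
    window.localStorage.setItem('__cache_test__', '1');
    window.localStorage.removeItem('__cache_test__');
    return 'localStorage';
  } catch (e) {
    return 'memory';
  }
}

const memoryStore = new Map();

const cache = {
  engine: detectEngine(),
  save(key, value) {
    if (this.engine === 'localStorage') {
      window.localStorage.setItem(key, JSON.stringify(value));
    } else {
      memoryStore.set(key, value);
    }
  },
  load(key) {
    if (this.engine === 'localStorage') {
      const raw = window.localStorage.getItem(key);
      return raw === null ? undefined : JSON.parse(raw);
    }
    return memoryStore.get(key);
  }
};

export default cache;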
redux-thunk middleware offers a custom argument injection feature you could use.
When creating the store:
import { createStore, applyMiddleware } from 'redux'
import thunk from 'redux-thunk'

const cache = createCache()
const store = createStore(
  reducer,
  applyMiddleware(thunk.withExtraArgument(cache))
)
Then in your action creator
function getValue(id) {
  return (dispatch, getState, cache) => {
    // use cache
  }
}
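A slightly fuller sketch of such an action creator, checking the cache before calling the API (api.fetchValue, the cache key format, and the action type are made up for illustration):
function getValue(id) {
  return async (dispatch, getState, cache) => {
    const cached = cache.load('value:' + id);
    if (cached !== undefined) {
      // serve from the cache, no network request
      return dispatch({ type: 'VALUE_LOADED', payload: cached });
    }
    const value = await api.fetchValue(id); // hypothetical API client
    cache.save('value:' + id, value);
    dispatch({ type: 'VALUE_LOADED', payload: value });
  };
}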

Persistent object in node.js

I am fairly new to Node and back-end work in general. We are using Express at work for a pretty large monolith application that houses all of our endpoints and services. My task is to grab a property that comes in on the request object (i.e. request.someObject) and use it in several services. In a DOM environment I would use something like localStorage or sessionStorage to store that data and then reuse it where needed across the application. I'm going to try to explain a little further with code:
Endpoint
router.route('/:cartId/generatePaymentLink')
  .post(jsonParser, urlParser, function(request, response, next) {
    var theObject = request.someObject;
    // need to pass theObject to several services
  });
Services are stored in separate files; here is an example of one:
var paypalInfo = new Paypal({
  userId: theObject.user_id,
  password: theObject.password
});
I can pass it through parameters to the services that use the data, but it's used in several different places and would have to be handled in each of those individual services. Is there a way to create a persistent object/config file that I can just import to get that data, or something like sessionStorage in the DOM?
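For illustration, the parameter-passing approach mentioned above could look like this, with the service wrapped in a factory function (createPaypalInfo and the file paths are made-up names):
// services/paypal.js
function createPaypalInfo(theObject) {
  return new Paypal({
    userId: theObject.user_id,
    password: theObject.password
  });
}
module.exports = createPaypalInfo;

// in the route handler
var createPaypalInfo = require('./services/paypal');
router.route('/:cartId/generatePaymentLink')
  .post(jsonParser, urlParser, function(request, response, next) {
    var paypalInfo = createPaypalInfo(request.someObject);
    // ...
  });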
EDIT: Not a database; we are avoiding that for performance reasons and are looking for another solution.

Angular JS Service Architecture

EDIT
The short version:
Say I have application data spread across many different services. How do I get around needing to inject all of those services into every controller that displays application state?
EDIT
I am building my first Angular application. The basic design is that I have a home page that shows the value of about 5 different variables (which are each pretty complicated). While on this page the app is collecting and analyzing data from Bluetooth. Occasionally, these 5 variables and some Bluetooth data are saved to a REST back end and also saved to the device. There are pages for each of these 5 variables to change their value.
I have done my best to follow best practices. I have very thin controllers. I use services for all my data. I really only use $scope for binding data between views and controllers.
My issue now is that I started with a global "State" service to keep track of those 5 variables. I inject it into any controller that needs to display state and bind the HTML to it. Any time I want to change any state, I call a method of that State service to do it. This worked well, but now that State service is getting huge.
I have tried to break functions out into other services, but I run into the issue of needing to read data from the State service and then write back to other properties of the State service. If I inject the other service into State, I can't inject State into the other service too, because that would be a circular dependency.
I have thought about how I could have many smaller services, but I keep coming back to when I save the data to the server. When I do that I need to gather up data from every corner of the application to send up. If all this information is stored in different services, I am left with injecting all of them into a single service once again.
As I write this, I am pretty sure I am missing a big concept with using $scope across an application.
Any pointers would be appreciated,
Thanks,
Scott
Could you divide things into sub-services and make the State service an aggregator of those sub-services? Then, instead of injecting State into the sub-services, you inject only the specific sub-service that you need. E.g.:
var app = angular.module('services', []);

app.service('sub1', function() {
  return {
    // ...
  };
});

app.service('sub2', function(sub1) {
  var data = sub1.getData();
  data.prop = 'new_value';
  sub1.setData(data);
  return {
    // ...
  };
});

app.service('State', function(sub1, sub2) {
  var data = sub1.getData();
  data.prop = 'new_value';
  sub1.setData(data);

  data = sub2.getData();
  data.prop = 'new_value';
  sub2.setData(data);
  return {
    // ...
  };
});
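A controller that only displays one slice of state would then inject just the sub-service it needs (HomeCtrl and getData are placeholder names):
app.controller('HomeCtrl', function($scope, sub1) {
  // only the sub-service this view actually displays is injected
  $scope.data = sub1.getData();
});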
It looks like you need Redux to help you manage your application state:
https://github.com/wbuchwalter/ng-redux

Single Page Application External Configurations (Not On NodeJS)

I'm looking for either a reference or an answer to what I think is a very common problem for people who are currently implementing JavaScript MVC frameworks (such as Angular, Ember or Backbone).
I am looking for a way or common pattern to externalize application properties so that they are accessible in the JS realm. Something that would allow the JavaScript to load server-side properties such as endpoints, salts, etc. that live outside the application root. The issue I'm running into is that browsers typically do not have access to the file system, because that is a security concern.
Therefore, what is the recommended approach for loading properties that are configurable outside of a deployable artifact, if such a thing exists?
If not, what current practice is considered the recommended approach for this type of problem?
I am looking for a cross compatible answer (Google Chrome is awesome, I agree).
Data Driven Local Storage Pattern
Just came up with that!!
The idea is to load the configuration properties using a convention-over-configuration approach where all properties are derived from the target hostname. That is, the hostname determines a trusted endpoint, and that endpoint serves the corresponding properties to the application. These application properties contain the information that is relevant at runtime, and they are supplied to the integration points by iterating over them during bootstrap at start-up.
To keep it simple, we'll just use two properties here:
This implementation is Ember JS specific, but the general idea should be portable.
I am narrowing the scope of this question to a specific technology, Ember JS, with the following remedy that is working properly for me; I hope it will help any of you dealing with the same issue.
Ember.Application.initializer implementation at start-up
initialize: function (container, application) {
  var origin = window.location.origin;
  var host = window.location.hostname;
  var port = window.location.port;
  var configurationEndPoint = '';

  // local mode
  if (host === 'localhost') {
    if (port === '8000') {
      // standalone, using an API stub on Node.js
      configurationEndPoint = '/api/local';
    } else {
      // standalone UI app integrating with a back end on the same machine, different port
      configurationEndPoint = '/services/env';
    }
    origin += configurationEndPoint;
  } else {
    throw new Error('Unsupported Environment!!');
  }

  // load the configuration from a trusted resource and store it in localStorage on start up
  $.get(origin, function(data) {
    // store all configuration key/value pairs in localStorage for later access
    var configuration = data.configuration;
    for (var config in configuration) {
      localStorage.setItem(config, configuration[config]);
    }
  });
}
Configurable Adapter
// adapters/application.js
import DS from 'ember-data';

export default DS.RESTAdapter.extend({
  host: localStorage.host,
  namespace: localStorage.namespace
});
Just yesterday morning I was tackling the same issue.
Basically, you have two options:
1. Use localStorage/indexedDB or any other client-side persistent storage. (But you still have to get the config in there somehow.)
2. Render your main template (the one that always gets rendered) with a hidden element into which you put the config JSON.
Then, in your app init code, you read this config and use it. Plain and simple in theory, but let's get down to nasty practice (for the second option; a sketch of it follows at the end of this answer).
First, the client should get the config before the application loads, which is not always easy; e.g. the user may need to be logged in to see the config. In my case I check whether I can provide the config on the first request, and if not, I redirect the user to the login page. This leads us to the second limitation: once you are ready to provide the config, you have to reboot the app completely so that the configuration code runs again (at least in Angular this is necessary, as you cannot access providers after the app bootstraps).
Another constraint: the second option is useless if you serve static HTML and cannot modify it on the server before sending it to the client.
Maybe a better option would be to combine both variants. This should solve some problems for returning users, but the first interaction will not be very pleasant anyway. I have not tried this yet.
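A minimal sketch of the second option, assuming the server renders the config JSON into a data block with a made-up id of app-config:
// The server renders something like this into the main template:
//   <script type="application/json" id="app-config">
//     {"host": "https://api.example.com", "namespace": "api/v1"}
//   </script>

// In the app init code, before bootstrapping:
var configElement = document.getElementById('app-config');
var config = JSON.parse(configElement.textContent);
// config.host and config.namespace are now available to whatever framework you bootstrap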

Single record persistence with ember-data

In Ember.js with ember-data (using the 1.0pre versions) all changes to the data are saved into a defaultTransaction on the store. When the store is committed with store.commit() ALL changes to the data are saved back to the API (using the RESTAdapter).
I would like more control over which objects get persisted. So for now, I have been getting instances of the store and adapter, then calling something like adapter.createRecord(store, type, record) or adapter.updateRecord, where type is the App.Person model and record is an instance of that model.
This is using internal bits of the DS.RESTAdapter that I don't think are meant to be used directly. While it works, I'm hoping there is a better way to gain more control over persistence than store.commit(). The business logic and UX of my application require finer control.
// create a transaction scoped to just the records you add to it
var transaction = router.get('store').transaction();
var person = transaction.createRecord(App.Person);
person.set('name', 'Thanatos');
// commits only this transaction's records, not every pending change in the store
transaction.commit();
Watch Yehuda's presentation regarding this:
http://www.cloudee.com/preview/collection/4fdfec8517ee3d671800001d
