How to store data in the local browser & retrieve it back - javascript

I have the JSON data below; I want to store it in my browser and later get it back when the user requests it from a textbox. How do I do this?
Actually, I am a server-side programmer and this is my second javascript/jquery demo example. I am basically trying to learn this stuff by building demos. Please help me learn.
I have JSON data obtained by calling remote websites (e.g. www.google.com/finance/....):
{
    "list": {
        "meta": {
            "type": "resource-list",
            "start": 0,
            "count": 168
        },
        "resources": [{
            "resource": {
                "classname": "Quote",
                "fields": {
                    "name": "USD/KRW",
                    "price": "1062.280029",
                    "symbol": "KRW=X",
                    "ts": "1396294510",
                    "type": "currency",
                    "utctime": "2014-03-31T19:35:10+0000",
                    "volume": "0"
                }
            }
        }, {
            "resource": {
                "classname": "Quote",
                "fields": {
                    "name": "SILVER 1 OZ 999 NY",
                    "price": "0.050674",
                    "symbol": "XAG=X",
                    "ts": "1396287757",
                    "type": "currency",
                    "utctime": "2014-03-31T17:42:37+0000",
                    "volume": "217"
                }
            }
        }]
    }
}

Using localStorage you could do:
Set item:
localStorage.setItem('myJSON',yourJSONString);
Remove item:
localStorage.removeItem('myJSON');
Get item:
var JSONString = localStorage.getItem('myJSON');
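Putting it together with the question's textbox lookup, a minimal sketch could look like this (the element ids symbolBox and result, and the variable jsonString holding the fetched quote JSON, are assumptions for illustration):

// store the raw JSON string once it has been fetched
localStorage.setItem('myJSON', jsonString);

// later, when the user asks for a symbol via a textbox
function lookupSymbol() {
    var wanted = document.getElementById('symbolBox').value;   // e.g. "KRW=X"
    var data = JSON.parse(localStorage.getItem('myJSON'));     // string back to object
    var resources = data.list.resources;
    for (var i = 0; i < resources.length; i++) {
        var fields = resources[i].resource.fields;
        if (fields.symbol === wanted) {
            document.getElementById('result').innerHTML = fields.name + ': ' + fields.price;
            return;
        }
    }
    document.getElementById('result').innerHTML = 'Symbol not found';
}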

There are several types of browser storage, such as localStorage; they are all built in and can be used directly.
Storage objects are a recent addition to the standard. As such they may not be present in all browsers. … The maximum size of data that can be saved is severely restricted by the use of cookies.
Code sample:
function storeMyContact(id) {
    var fullname = document.getElementById('fullname').innerHTML;
    var phone = document.getElementById('phone').innerHTML;
    var email = document.getElementById('email').innerHTML;

    localStorage.setItem('mcFull', fullname);
    localStorage.setItem('mcPhone', phone);
    localStorage.setItem('mcEmail', email);
}
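A matching retrieval function, as a sketch using the same keys and element ids as above, could read the values back like this:

function retrieveMyContact() {
    // getItem returns null if the key has never been set
    var fullname = localStorage.getItem('mcFull');
    var phone = localStorage.getItem('mcPhone');
    var email = localStorage.getItem('mcEmail');

    document.getElementById('fullname').innerHTML = fullname || '';
    document.getElementById('phone').innerHTML = phone || '';
    document.getElementById('email').innerHTML = email || '';
}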
On the other hand, localStorage might not be enough; in that case external libraries come in handy. They build on the browser's built-in storage and make the database work across browsers.
1- SQL-like DB sequelsphere (looks suitable for heavy lifting!)
Code sample for a query that will run directly in the browser:
SELECT empl_id, name, age
FROM empl
WHERE age < 30
2- JSON-like DB taffydb (looks suitable for everyday use!)
// Create DB and fill it with records
var friends = TAFFY([
    {"id":1,"gender":"M","first":"John","last":"Smith","city":"Seattle, WA","status":"Active"},
    {"id":2,"gender":"F","first":"Kelly","last":"Ruth","city":"Dallas, TX","status":"Active"},
    {"id":3,"gender":"M","first":"Jeff","last":"Stevenson","city":"Washington, D.C.","status":"Active"},
    {"id":4,"gender":"F","first":"Jennifer","last":"Gill","city":"Seattle, WA","status":"Active"}
]);

// Find all the friends in Seattle
friends({city: "Seattle, WA"});
3- jstorage is a cross-browser key/value store for saving data locally in the browser. jStorage supports all major browsers, both desktop (yes, even Internet Explorer 6) and mobile.
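For illustration, the jStorage calls corresponding to the localStorage ones above look roughly like this (a sketch; it assumes jQuery and jstorage.js are already loaded on the page):

// store, read (with a default if the key is missing), and remove a value
$.jStorage.set('myJSON', yourJSONString);
var stored = $.jStorage.get('myJSON', 'not set');
$.jStorage.deleteKey('myJSON');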
If you would like more options, see client-side-browser-database.

Related

Limit length of array-type property in Loopback 4 query?

I've been trying this new framework LoopBack 4 and it's awesome, but I don't know how flexible it is. I have the following model in the database:
{
    "id": "string",
    "lastUpdate": "2020-10-01T18:10:46.306Z",
    "name": "string",
    "logo": "string",
    "data": [
        {}
    ]
}
What I'm trying to do is make a query that returns the data, but since it is an array it can hold a lot of entries, and I would like to paginate it, so I thought of limiting the query. I've got a query that looks like the following:
{
    "offset": 0,
    "limit": 10,
    "skip": 0,
    "where": {
        "name": {"eq": "BengalaSpain"}
    },
    "fields": {
        "data": true
    }
}
I'm trying to limit the data property to 10, but of course this doesn't affect the property itself, just the wrapper object around it. Is there any way to achieve what I'm trying to do?
Thanks in advance guys!
LoopBack 4 filters apply at a Repository level as these constraints are passed to the ORM datasource connectors to be converted into their respective native queries (e.g. TOP 10 for SQL Server).
A possible solution is to link the data field into a Relation. Relations essentially create nested Repositories (e.g. hasManyRepository), hence are able to meet the requirement of isolating data into its own Repository.
To quickly create a relation, remove the property from the Model and re-create it using lb4 relation command.
From there, it would be possible to take advantage of the now-enabled InclusionResolver and use a query such as:
{
    "where": {
        "name": {"eq": "BengalaSpain"}
    },
    "fields": {
        "data": true
    },
    "include": [
        {
            "relation": "<relation name here>",
            "scope": {
                "limit": 10
            }
        }
    ]
}
A side-effect is the separation of data into its own table. However, this should be done regardless, for the sake of database normalization.

How to replace just one property of a JSON object in a JSON array with a large number of JSON objects in Node.js?

I have searched many online articles about parsing a JSON array and looked for an npm package to do it, but all my efforts were in vain.
I have a JSON array like this:
{
    "pctProjects": [
        {
            "ID": "1",
            "Name": "Software Upgrade",
            "Desc": "GO! V1 Chapter 5- EOC Mastery Exercise",
            "AppId": "1",
            "UserId": "1",
            "CreatedDate": "2008-07-30T00:00:00",
            "Score": "100",
            "SeriesID": "2",
            "IsPublished": "1",
            "PublishedLMSVariationID": "5",
            "IsPCTActive": "0",
            "IVTEnabled": "0",
            "IsActiveInSelectPopup": "1",
            "Chapter": "CH05"
        },
        {
            "ID": "2",
            "Name": "Business Venture",
            "Desc": "Exploring Volume 1 Chapter 3- TST Exercise",
            "AppId": "1",
            "UserId": "1",
            "CreatedDate": "2008-07-30T00:00:00",
            "Score": "100",
            "SeriesID": "1",
            "IsPublished": "1",
            "PublishedLMSVariationID": "7",
            "IsPCTActive": "0",
            "IVTEnabled": "0",
            "IsActiveInSelectPopup": "1",
            "Chapter": "CH03"
        }
.
.
.
I looped through my JSON array "pctProjects" (after JSON.parse) and was able to find the object via the property PublishedLMSVariationID, which I need to replace with another value. I'm using filesystem module functions like appendFileSync() and writeFileSync() to update the file. But with these methods I have to rewrite the data of all the other objects I'm not changing, and this is not an optimised way to do it, since the array can contain any number of objects.
Using replace-in-file also did not help me achieve my goal.
Here is the code snippet of what I'm doing right now, which is not optimised:
for (let item of gulpJson.pctProjects) {
    // console.log(typeof touseVariationId)
    if (counter == 1 && item.PublishedLMSVariationID == results[0]) {
        item.PublishedLMSVariationID = results[1]
        fs.writeFileSync(path.join(dir, 'PCT5_MasterPCTProjectsForGulp.json'), JSON.stringify(item, null, 4));
        counter++;
    }
    else if (counter == 1 && item.PublishedLMSVariationID != results[0]) {
        fs.writeFileSync(path.join(dir, 'PCT5_MasterPCTProjectsForGulp.json'), JSON.stringify(item, null, 4));
        counter++;
    }
    else if (item.PublishedLMSVariationID == results[0]) {
        item.PublishedLMSVariationID = results[1]
        fs.appendFileSync(path.join(dir, 'PCT5_MasterPCTProjectsForGulp.json'), JSON.stringify(item, null, 4));
        // break
        // fs.writeFileSync(path.join(dir,'PCT5_MasterPCTProjectsForGulp.json',null, 2), JSON.stringify(item));
    }
    else
        fs.appendFileSync(path.join(dir, 'PCT5_MasterPCTProjectsForGulp.json'), "," + "\n" + "\t" + JSON.stringify(item, null, 4));
}
Question:
Is there any way to replace just one JSON property in a JSON array in Node.js?
Much obliged :) Thanks in advance.
Please comment if any more information is needed.
From what I understand, you are asking whether you can edit your JSON file on the file system differentially, without rewriting the whole thing. Although I'm sure that is possible, I would recommend simply rewriting your entire JSON file each time you want to update it. If the JSON is so huge that this becomes too tedious/time-consuming, you may be better off using a DB of some sort (MySQL, Mongo, Firebase etc.).
My recommendation would be to do it in the following order:
Retrieve the JSON string from the file (let's call it source.json)
Parse the JSON string into an object using JSON.parse
Update the value you want to change in the parsed object by looping over it and overwriting values as needed
Use JSON.stringify to get back a string representation (of the entire object obtained in step 2) and overwrite your source.json with the new JSON
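Put together, those four steps might look roughly like this in Node (a sketch; the file name, field names, dir and results are taken from the question's snippet):

const fs = require('fs');
const path = require('path');

const file = path.join(dir, 'PCT5_MasterPCTProjectsForGulp.json');

// 1. Read the JSON string from disk
const source = fs.readFileSync(file, 'utf8');

// 2. Parse it into an object
const gulpJson = JSON.parse(source);

// 3. Update the matching records in memory
for (const item of gulpJson.pctProjects) {
    if (item.PublishedLMSVariationID === results[0]) {
        item.PublishedLMSVariationID = results[1];
    }
}

// 4. Stringify the whole object and overwrite the file once
fs.writeFileSync(file, JSON.stringify(gulpJson, null, 4));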
I was able to achieve my result in an optimised way by using the npm replace-in-file module.
Here is my code snippet where I have used that module:
const replace = require('replace-in-file');

replace.sync({
    files: path.join(dir, 'PCT5_MasterPCTProjectsForGulp1.json'),
    from: results[0],
    to: results[1]
});

Facebook Graph API - get page feed + full event info?

Is it possible to, in 1 request, get the feed of a page but with the full event info?
As it is now, if a shared event is posted, you only get back the link to that event, no picture or title:
{
    "id": "xxx",
    "from": {
        "category": "Community",
        "name": "xxx",
        "id": "xxx"
    },
    "story": "xxx shared xxx's event.",
    "link": "https://www.facebook.com/events/xxx/",
    "actions": [
        {
            "name": "Comment",
            "link": "https://www.facebook.com/xxx/posts/xxx"
        },
        {
            "name": "Like",
            "link": "https://www.facebook.com/xxx/posts/xxx"
        }
    ],
    "privacy": {
        "value": ""
    },
    "type": "link",
    "status_type": "shared_story",
    "application": {
        "name": "Links",
        "id": "xxx"
    },
    "created_time": "2013-06-19T10:05:50+0000",
    "updated_time": "2013-06-19T10:05:50+0000",
    "likes": {
        "data": [
            {
                "name": "xxx",
                "id": "xxx"
            }
        ],
        "count": 1
    }
}
If I understand correctly, you need to retrieve the events but you want to do it all at once with the feed because you want to retrieve the information on the feed anyway.
Before doing that, you must know that the feed doesn't contain all the events. Once created, a link to the event is automatically shared on the page feed. It is only a reference, which can then be hidden. The event won't be displayed on the feed anymore even if it still exists.
Requesting two different objects at the same time
So, the feed doesn't hold the events' information, and events and posts (feed) are stored in two different tables. Therefore, you need to get the events independently from the feed:
The feed /PAGE_ID/feed
The events /PAGE_ID/events
And, as you wanted, Graph API allows you to do this in only one request:
/PAGE_ID?fields=feed,events
Additional fields
Note that both feed and events accept the limit and fields parameters. For example, the events fields can be specified with:
events.limit(100).fields(location,name,owner,description,updated_time,venue)
Possible fields are given in the doc.
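As an illustration only, issuing that combined request from the browser with jQuery could look something like this (PAGE_ID and ACCESS_TOKEN are placeholders you must supply; the exact field list is up to you):

// build the Graph API URL with field expansion for feed and events
var url = 'https://graph.facebook.com/PAGE_ID' +
          '?fields=feed.limit(25),events.limit(100).fields(location,name,owner,description,updated_time,venue)' +
          '&access_token=ACCESS_TOKEN';

$.getJSON(url, function (page) {
    // page.feed.data and page.events.data each come back as arrays
    console.log(page.feed ? page.feed.data : 'no feed returned');
    console.log(page.events ? page.events.data : 'no events returned');
});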
There is no way to get the "full info" at once. You will have to specify each field in the request. So, don't get the "full info", but just the information you really need.
There is a post from Facebook addressing this scenario using multi-queries and FQL (Facebook Query Language). This will allow you to make multiple FQL calls in one request.
https://developers.facebook.com/docs/technical-guides/fql/#multi

How to translate Solr JSON response into HTML while JSON is different every time

I am using Solr 4 for searching in a Java web application. Solr produces a JSON response from which I have to extract the search results and translate them into HTML so the user can read them.
I know one solution, but it seems dumb and I think there must be smarter ideas.
{
    "responseHeader": {
        "status": 0,
        "QTime": 0,
        "params": {
            "fl": "id,title",
            "indent": "true",
            "q": "solr",
            "wt": "json"
        }
    },
    "response": {
        "numFound": 3,
        "start": 0,
        "docs": [
            {
                "id": "1",
                "title": "Solr cookbook"
            },
            {
                "id": "2",
                "title": "Solr results"
            },
            {
                "id": "3",
                "title": "Solr perfect search"
            }
        ]
    }
}
After that I eval this text as:
var obj = eval ("(" + txt + ")");
To generate the HTML page I can use either
<script>
    document.getElementById("id").innerHTML = obj.response.docs[1].id;
    document.getElementById("title").innerHTML = obj.response.docs[1].title;
</script>
or
document.write(obj.response.docs[1].id);
But the limitation is that Solr returns a response with a different object structure every time, i.e. one object may have an age field while another does not, because it depends on the query.
I want to use a single JSP page to display search results (like Google) for all search queries.
Is it possible to write a single code segment which works for any possible search result with a different schema?
JavaScript stops working after encountering any error, which is likely in my case; that's also a problem. If I use a for loop to traverse the object hierarchy, it is highly error-prone.
Is it possible with a single view page? Thanks.
You might want to consider using ajax-solr, a JavaScript framework for creating user interfaces to Solr.
I suggest using Velocity templating, which is readily supported in Solr, instead of extracting data from the JSON and rendering the HTML via JS.
Docs here
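If you do stay with client-side rendering, one schema-agnostic option is to loop over whatever fields each doc happens to have rather than hard-coding them. A sketch, assuming the response is already parsed into obj and a container <div id="results"> exists on the page:

var container = document.getElementById('results');
var docs = obj.response.docs;
var html = '';

for (var i = 0; i < docs.length; i++) {
    html += '<div class="doc">';
    // iterate over whatever fields this particular doc has
    for (var field in docs[i]) {
        if (docs[i].hasOwnProperty(field)) {
            html += '<p><strong>' + field + ':</strong> ' + docs[i][field] + '</p>';
        }
    }
    html += '</div>';
}

container.innerHTML = html;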

Parse JSON from local url with jQuery

I have a local url where I can retrieve a JSON file. I also have a simple website which is built using jQuery.
I've looked at many sites for tutorials and sample code on how to retrieve the JSON input and parse it so I can display it on my site. However, none were helpful, as I still can't make it work.
So as a last resort I'm going to ask Stack Overflow for your help. I have a lot of Java knowledge, but I'm relatively new to web development and only know some basics of JavaScript.
This is a sample output of my url:
[
    {
        "baken": "not implemented...",
        "deviceType": "Optimus 2X",
        "batteryLevel": "1.0",
        "gps": {
            "speed": 0,
            "Date": "TueNov0100: 34: 49CET2011",
            "Accuracy": 35,
            "longitude": {removed},
            "latitude": {removed},
            "Provider": "gps"
        },
        "deviceId": "4423"
    },
    {
        "baken": "notimplemented...",
        "deviceType": "iPhone",
        "batteryLevel": "30.0",
        "gps": {
            "speed": 0,
            "Date": "TueNov0116: 18: 51CET2011",
            "Accuracy": 65,
            "longitude": {removed},
            "latitude": {removed},
            "Provider": null
        },
        "deviceId": "4426"
    }
]
Hope you can help me.
If you are running a local web server and both the website and the JSON file are served by it, you can simply do:
$.getJSON('path/to/json/file.json', function(data) {
    document.write(data);
});
If you are just using files and no web server, you might run into the browser's same-origin policy, since AJAX requests cannot be sent cross-domain and the origin is 'null' by default for requests from local files.
If you are using Chrome you can try the --allow-file-access-from-files parameter for developing purposes.
Your URL returns invalid JSON. Try pasting it into jsonlint.com and validating it there and you'll see what I mean. Even the code highlighting here on Stack Overflow is showing you what's wrong. :)
Edit: To parse it you can use jQuery.parseJSON
jQuery.parseJSON('{"foo": "goo"}');
$.get('/some.json', function(data) {
    // data[0]["baken"] == "not implemented..."
});
See http://api.jquery.com/jQuery.get/
You don't need to parse the JSON -- that is why people like it. It becomes a native JavaScript object.
For your example if you put the results in a variable called data then you could do things like this:
data[0].deviceType // would be "Optimus 2X"
data[0].gps.speed // would be numeric 0
etc.
The most natural way is to allow jQuery to make an AJAX call for you once you've already entered the page. Here's an example:
$(document).ready(function() {
    // put your other code for page initialization here

    // set up a global object, for namespacing issues, to hold your JSON.
    // this allows you to be a good "web" citizen, because you will create
    // one object in the global space that will house your objects without
    // clobbering other global objects from other scripts, e.g., jQuery
    // makes the global objects '$' and 'jQuery'
    myObjects = {};

    // start JSON retrieval here
    $.getJSON('/path/to/json/file.json', function(data) {
        // 'data' contains your JSON.
        // do things with it here in the context of this function.
        // then add it to your global object for later use.
        myObjects.myJson = data;
    });
});
The API documentation is here
