JS / JSON / jQuery - How to insert data into a JSON file, plus a little CSS - javascript

Today I want to create a page that:
inserts data from a JSON file into divs, but also lets you add new entries to that JSON.
I also want to create a like button with an increasing counter and a sort function based on the likes.
I don't know anything about PHP or Ajax. Is it maybe possible to solve this without them?
Here is my code. It's not working yet, obviously :)
But I hope you get the idea. I basically think I'm heading in the right direction and just missing some pieces :)
For the design I put a div which contains two divs, one for the image and one for the text. I later want to style it with CSS so that each row holds two filmbox divs, and with CSS flexbox I want them to wrap so that when the window gets smaller there is only one box per row.
For example, I set .filmbox { width: 40%; } so there is only space for two per row and the rest wrap down.
[Image: layout of the page]
// Copy-pasted this Ajax snippet from the internet to read JSON data from an external file
let readJSON = function (file) {
    let json = {};
    $.ajax({
        'async': false,
        'global': false,
        'url': file,
        'dataType': "json",
        'success': function (data) {
            json = data;
        }
    });
    return json;
};

let film = readJSON("film.json");
console.table(film);

// Insert the data into the webpage
for (let i in film) {
    $(".content:eq(" + i + ")").append(`<div class="filmbox">
        <div class="imgfield"><img src="${film[i].img}"></div>
        <div class="textfield">
            <h1>${film[i].name}</h1><br>
            ${film[i].description}<br>
            <button type="button" id="button">LIKE</button>
            <span id="likeCounter"></span>
        </div>
    </div>`);
}

// Read new data from the form - update the website on form click
function getValues() {
    let filmName = $("#fname").val();
    let description = $("#fdescription").val();
    let img = $("#fimg").val();
    return [{
        "name": filmName,
        "description": description,
        "img": img
    }];
}

$("#formclick").on("click", () => {
    let filmData = getValues();
    // how to insert this into the json file ?
    window.location.reload();
});

// Like button function
let counter = 0;
$("#button").on("click", () => {
    $("#likeCounter").text(++counter);
});

// SORT FUNCTION to display the films with the most likes first
function orderDivs() {
    // Well, and now sort all the divs according to the like values the users clicked.
    // The like values don't need to be saved anywhere - when I like some films on the
    // page by randomly clicking, the divs should then be sorted according to that.
}

$("#sortbutton").on("click", () => {
    orderDivs();
});

To save the data to a JSON file you'd typically post the data to a server that has file system access (e.g. PHP or Node.js).
If you want to do this client-side, you could look at the FileSystem API, or persist the data in the browser with localStorage.
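For example, a minimal sketch of the localStorage route, assuming a film.json-shaped array and the form field ids used in the question:

// Load the saved films, falling back to an empty array the first time
function loadFilms() {
    return JSON.parse(localStorage.getItem("films") || "[]");
}

// Append one film object and write the whole list back
function saveFilm(newFilm) {
    const films = loadFilms();
    films.push(newFilm);
    localStorage.setItem("films", JSON.stringify(films));
}

// Usage with the form fields from the question:
saveFilm({
    name: $("#fname").val(),
    description: $("#fdescription").val(),
    img: $("#fimg").val()
});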
Regarding the like button / count increment / sorting, it would be better to post this as a separate, more specific question.

Related

Turning API data into HTML

Sorry for the newbie question.
I'm new to using APIs and I want to take the data from here
https://api.guildwars2.com/v2/commerce/prices/24615 (specifically the unit price under sells:) and display it in HTML.
This project will use the unit price data across roughly 100 IDs, and I'd like to organize these numbers and run some basic math with them.
How can I go about this?
fetch('https://api.guildwars2.com/v2/commerce/prices/24615')
    .then(function (response) {
        return response.json();
    })
    .then(function (myJson) {
        console.log(JSON.stringify(myJson));
    });
So far I can get the data into the console, but I'm not sure how to turn this into something I can work with.
jQuery also has a helper for making API calls; the following code accesses the API data:
$(document).ready(function () {
    var info;
    var whitelisted;
    var quantity;
    $.get("https://api.guildwars2.com/v2/commerce/prices/24615", function (obj) {
        info = obj['id'];
        whitelisted = obj['whitelisted']; // note: the API field is spelled "whitelisted"
        quantity = obj.buys['quantity'];
        $("#id1").html("id: " + info);
        $("#whitelist").html("whitelisted: " + whitelisted);
        $("#quan").html("quantity: " + quantity);
    });
});
For more info you could look into the following pen: link to the code
In your callback function, where you log myJson, update the HTML you already created.
For example, if you have a div, <div id="myDiv"> </div>, do something like this in the response function:
const myDiv = document.getElementById("myDiv")
myDiv.textContent = myJson.sells.unit_price
And your div will show the sell unit price from the JSON, or whatever field you need. Play around with these ideas and you'll get far.
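To scale this up to the roughly 100 IDs mentioned in the question, one sketch (the ID list below is a placeholder) is to fetch each price object and then run the math over the collected results:

// Placeholder list - substitute the ~100 item IDs you actually need
const itemIds = [24615 /*, ... */];

Promise.all(
    itemIds.map(function (id) {
        return fetch('https://api.guildwars2.com/v2/commerce/prices/' + id)
            .then(function (response) { return response.json(); });
    })
).then(function (prices) {
    // Each result has a sells.unit_price field; sum them as an example calculation
    const totalSellPrice = prices.reduce(function (sum, p) {
        return sum + p.sells.unit_price;
    }, 0);
    console.log('Total sell unit price:', totalSellPrice);
});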

Append new files

I have an API running that fetches all the file names in a directory and returns an array inside an array. Currently I run it every second to check whether a new file has been added and, if so, embed it in my div. The issue is that I have to empty my HTML every time and then re-embed everything. Is there a better way to do this, so that I only embed the new filenames rather than all of them again?
setInterval(function () {
    $.ajax({
        url: 'getFiles',
        success: function (data) {
            $("#pics").html("");
            console.log(data);
            $.each(data, function (k, o) {
                $.each(o, function (key, obj) {
                    $("#pics").append("<a href='#'>" + obj + "</a>");
                });
            });
        }
    });
}, 1000);
let images = []; // must be let, not const, because it gets reassigned below

setInterval(function () {
    $.ajax({
        url: 'getFiles',
        success: function (data) {
            const fetchedImages = data.images;
            if (images.length !== fetchedImages.length) { // They do not have the same elements
                images = fetchedImages;
                $("#pics").html("");
                const domImages = fetchedImages.map(image => "<a href='#'>" + image + "</a>");
                $("#pics").append(domImages.join(''));
            }
        }
    });
}, 1000);
From our discussion I was able to create this solution.
Since you know that you only need a list of images, you can just get it directly.
Then you can check whether the images saved locally have the same number of elements as the list you got from the server.
If they do not match, the list must have changed (a side-effect is that if someone only renamed a file, the length would stay the same and the change would be missed).
Now we just empty the #pics HTML and create a new array where each element is wrapped in an <a> tag.
Lastly, join takes an array and converts it to a string; '' means there shouldn't be any text between the elements, so the string looks like this:
"<a href='#'>image1.jpg</a><a href='#'>image2.jpg</a><a href='#'>image3.jpg</a>"
In your case, I would suggest keeping the current implementation: clear everything and regenerate the list from the newly received data. The reasons are:
From a performance point of view, clearing and regenerating is faster than comparing each item to decide whether it is a duplicate to keep, an old item to remove, or a new item to insert.
The order of the items is easily kept the same as in the received data, with no confusion with the old item list.
The rule I suggest is: if the new list is exactly the same, return without making any changes. If the list has changed, clear everything and rebuild it.
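A minimal sketch of that rule, assuming the same getFiles response shape as in the snippets above:

let knownImages = [];

function refresh(fetchedImages) {
    // If nothing changed, leave the DOM alone
    const unchanged = fetchedImages.length === knownImages.length &&
        fetchedImages.every(function (img, i) { return img === knownImages[i]; });
    if (unchanged) return;

    // Otherwise clear everything and rebuild in the received order
    knownImages = fetchedImages;
    $("#pics").html(fetchedImages.map(function (img) {
        return "<a href='#'>" + img + "</a>";
    }).join(''));
}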

Shopify Access a product with its id on thank you page without using '/admin' in url

I am trying to access a specific product using its id via the URL below:
https://tempstore.myshopify.com/products/1234.json
It's giving me a 404 error.
I am, however, able to access all products like this:
https://tempstore.myshopify.com/products.json
I have to access the product that was just processed in the checkout process.
I have its id as below:
var products = Shopify.checkout.line_items;
products will contain an array holding only the product ids that were processed in checkout. Now I need to access all the other properties of these products.
I can surely do this:
https://tempstore.myshopify.com/admin/products/1234.json
But that requires authentication.
Any thoughts?
From the frontend, you need to have the product handle to get the JSON object:
https://tempstore.myshopify.com/products/[handle].js
or
https://tempstore.myshopify.com/products/[handle].json
(Note that the returned values from the .js and .json endpoints are quite different from each other!)
Like you point out, the Shopify.checkout.line_items array of objects only has the product IDs, not the product handles. We're not completely out-of-luck, though, because we can get the entire list of products in the store including the product handles by hitting the /products.json endpoint.
Of course, this means grabbing a potentially huge JSON object just to get information that we should've had included in the checkout line items... but unless there's some alternate source of the line item information available on the checkout page, looping through the entire list may be what you need to do.
So your end code would look something like this:
Checkout.jQuery.getJSON( // Or whatever your preferred way of getting info is
    'https://tempstore.myshopify.com/products.json',
    function (prodlist) {
        // The endpoint returns an object with a "products" array
        var products = prodlist.products;
        for (var p = 0; p < products.length; p++) {
            var prod = products[p];
            // Check whether Shopify.checkout.line_items contains prod.id
            var found = Shopify.checkout.line_items.some(function (item) {
                return item.product_id === prod.id;
            });
            if (found) {
                Checkout.jQuery.getJSON(
                    'https://tempstore.myshopify.com/products/' + prod.handle + '.js',
                    function (product) {
                        /* Whatever needs to be done */
                    });
            }
        }
    }
);
Hope this helps!
var shop = Shopify.shop;
var lineItems = Shopify.checkout.line_items;
var url = 'https://' + shop + '/products.json?callback=?';
var requiredData = [];
$.getJSON(url).done(function (data) {
    lineItems.forEach(function (lineItemProduct) {
        var match = data.products.find(function (product) {
            return lineItemProduct.product_id == product.id;
        });
        if (match) {
            requiredData.push(match);
        }
    });
    // Log inside the callback - the request is asynchronous,
    // so requiredData is only filled once done() fires
    console.log(requiredData);
});
This is how I solved it, in case it helps anybody :)

Jquery exporting table to csv hidden table cells

I need to be able to export an HTML table to CSV. I found a snippet somewhere; it works, but not entirely how I want it to.
In my table (in the fiddle) I have hidden fields; I just use quick-and-dirty inline styling and inline onclicks to swap between what you see.
What I want from the export is that it selects the table as currently displayed, so only the td's where style="display:table-cell". I know how to do this in plain JS:
document.querySelectorAll('td[style="display:table-cell"]');
but how can I do this using the code I have right now in the exportTableToCSV function?
(Sorry, but the text in the fiddle is in Dutch, as it's a direct copy of the live version.)
The fiddle:
http://jsfiddle.net/5hfcjkdh/
In your grabRow method you can filter out the hidden table cells using jQuery's :visible selector. Below is an example:
function grabRow(i, row) {
    var $row = $(row);
    // For some reason $cols = $row.find('td') || $row.find('th') won't work...
    // Added :visible to ignore hidden cells
    var $cols = $row.find('td:visible');
    if (!$cols.length) $cols = $row.find('th:visible');
    return $cols.map(grabCol)
        .get().join(tmpColDelim);
}
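The full exportTableToCSV function from the fiddle isn't shown here, but a generic sketch built around the same :visible filtering could look like this (the delimiters and the data-URI download are assumptions, not the fiddle's exact code):

function exportTableToCSV($table, filename) {
    var tmpColDelim = '","';
    var tmpRowDelim = '"\r\n"';

    // Only visible rows and cells end up in the CSV
    var csv = '"' + $table.find('tr:visible').map(function (i, row) {
        var $cols = $(row).find('td:visible');
        if (!$cols.length) $cols = $(row).find('th:visible');
        return $cols.map(function (j, col) {
            // Escape embedded double quotes for CSV
            return $(col).text().replace(/"/g, '""');
        }).get().join(tmpColDelim);
    }).get().join(tmpRowDelim) + '"';

    // Trigger a download via a data URI (simple, but very large tables may hit URI size limits)
    var link = document.createElement('a');
    link.href = 'data:text/csv;charset=utf-8,' + encodeURIComponent(csv);
    link.download = filename;
    link.click();
}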
Here's how I solved it. I decided to step away from a pure JavaScript solution to take the processing load off the client and instead handle it server-side.
Because I already get the data from the database using a stored procedure, I use that to fetch the dataset again and convert it into a view-export model, so I have a TotalViewExport and a few trimmed variations (reusing most of it); based on a Selected variable I fill a different model.
I extended the existing show function to update a Selected variable that keeps track of the currently selected view.
When the user clicks "Export table to Excel" it calls the controller of the current page, e.g. AlarmReport (so AlarmReportController), where I created the action ExportReports(int? SelectedView).
In addition I added CsvExport as a manager. It takes data results (C# models / IQueryables / lists / etc.) and puts them into a CSV set; using the BinaryContent return type one can serve a .csv file with this data.
The ExportReports action calls the stored procedure with the SelectedView parameter. The result gets pumped into the correct model, and that model is pumped into the CsvExport model as rows.
The filename is built from the selected view + the selected object + the current date (yyyy-MM-dd), so for example "Total_Dolfinarium_2016-05-13".
Lastly, the action returns the .csv file as a download using the BinaryContent return type and ExportToBytes from CsvExport.
The export part of this action looks like this (shortened to leave out some checks, such as multiple objects being selected; the data and object names are gathered beforehand):
public ActionResult ExportCsv(CsvExport Data, string ObjectName, string Type) {
    // "yyyy-MM-dd" matches the example filename "Total_Dolfinarium_2016-05-13"
    var FileName = Type + "_" + ObjectName + "_" + DateTime.Now.ToString("yyyy-MM-dd");
    return BinaryContent("text/csv", FileName + ".csv", Data.ExportToBytes());
}

Using jQuery to pull text from a specific <td>

I'm running an AJAX query on an external page and am attempting to return only the data for the county. My current script pulls the text from all of the table cells, but I cannot for the life of me get it to simply pull the county name.
The current script that is being run:
$( ".zipCode" ).each(function( intIndex ){
var zipCodeID = $(this).attr('id');
console.log('http://www.uscounties.org/cffiles_web/counties/zip_res.cfm?zip='+zipCodeID);
$.ajax({
url: 'http://www.uscounties.org/cffiles_web/counties/zip_res.cfm?zip='+zipCodeID,
type: 'GET',
success: function(res) {
var headline = $(res.responseText).find("p").text();
console.log(headline);
$('#'+zipCodeID).empty();
$('#'+zipCodeID).append(headline);
}
});
});
An example of the page that is being queried:
http://www.uscounties.org/cffiles_web/counties/zip_res.cfm?zip=56159
This should work for all entered ZIPs. The page layout is always the same; I just can't get the function to return only the county. Any help or advice would be awesome. Thanks!
With the complete lack of ids and classes on that page, you don't really have much to go on. If you have access to the source of that page, stick an id or class on the cell and make your life so much easier. If not, you'll have to use what you know about the structure of the pages to find the county. Something like this will work specifically on that one page you linked to. If other pages have slight variations this will fail:
var headline = $(res.responseText).find("table > tr:eq(2) > td:eq(3)").text();
This assumes that there is only ever one table on the page and that the county always sits in that same row and cell (keep in mind that :eq() is zero-indexed).
You're basically screen scraping. I suspect you'll have issues with this due to cross-domain restrictions and other things, but that is ancillary to the question.
You need to walk through the resulting page. Assuming there is only ever one such table in the response, it'll look something like this:
var retVal = [];
// Basically, for each row in the returned table (parse the AJAX response, not the current document)...
$(res.responseText).find('tr').each(function () {
    var pTR = $(this);
    // Skip the header row.
    if (pTR.find('th').length == 0) {
        // This is the array of TDs in the given row.
        var pCells = $('td', pTR);
        retVal.push({
            state: $(pCells[0]).text(),
            place: $(pCells[1]).text(),
            county: $(pCells[2]).text()
        });
    }
});
// retVal now contains an array of objects, including county.
if (retVal.length > 0) {
    alert(retVal[0].county);
} else {
    alert('Cannot parse output page');
}
The parsing code is written to be extensible, hence you get back all of the data. With postal codes, although you will likely only ever get back one county, you'll definitely get back more places. Also note... not every zip code has a county attached for a variety of reasons, but you should get back an empty string in that case.
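Putting the pieces together, a hypothetical showCounty helper that could be called from the OP's success callback (the helper name and the cell position are assumptions based on the parsing code above):

// Extract only the county from the scraped page and drop it into the zip code element
function showCounty(zipCodeID, res) {
    var county = '';
    $(res.responseText).find('tr').each(function () {
        var $cells = $('td', this);
        // County assumed to be in the 3rd data cell, as in the parsing code above
        if (!county && $cells.length >= 3) {
            county = $cells.eq(2).text();
        }
    });
    $('#' + zipCodeID).empty().text(county || 'No county found');
}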
