Can I save any file in Parse with JavaScript?

I am creating a web application where the user can type anything into a text area and save it as any type of file (.doc, .txt, .java, .js, etc.), sort of like Notepad, but on the web. Can I save the file in Parse if the file type is .java? I've only been able to find tutorials on how to save image files in Parse.
Here is my code for saving the file in Parse:
var usercode = req.body.code;
var newname = name + ".java";
var parsefile = new Parse.File(newname, { base64: usercode });
parsefile.save().then(function() {
    console.log("File created and saved in Parse");
}, function(error) {
    console.log(error);
});
If my code is successful, I should see "File created and saved in Parse" in my console. However, I don't get anything, not even an error message.

I believe you can.
Parse stores PFFiles as binary data; as long as you have some way of serializing the data format when it goes in and comes back out, it should be fine.
Source: https://parse.com/docs/osx/api/Classes/PFFile.html
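A minimal sketch of that, assuming a Node/Express environment (so Buffer is available) and an initialized Parse JS SDK: the code in the question passes the raw text as if it were already base64, so encoding it first is likely the missing step.
// Sketch: base64-encode the raw text before handing it to Parse.File.
var usercode = req.body.code;
var encoded = new Buffer(usercode, 'utf8').toString('base64');
var parsefile = new Parse.File(name + '.java', { base64: encoded });

parsefile.save().then(function (savedFile) {
    console.log('File created and saved in Parse: ' + savedFile.url());
}, function (error) {
    console.log('Save failed: ' + error.message);
});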

new Parse.File(name, content): content must be an array of byte values (numbers), so convert the string to bytes first.
So:
var bytes = [];
for (var x = 0; x < content.length; x++) {
    bytes.push(content.charCodeAt(x));
}
then call new Parse.File(name, bytes).
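Put together, a runnable sketch of this byte-array approach could look like the following; name and content stand in for the user's file name and text.
// Sketch: build a byte array from the text and pass it to Parse.File.
var bytes = [];
for (var x = 0; x < content.length; x++) {
    bytes.push(content.charCodeAt(x));
}

var parsefile = new Parse.File(name + '.java', bytes, 'text/plain');
parsefile.save().then(function () {
    console.log('File created and saved in Parse');
}, function (error) {
    console.log(error);
});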

Related

Reading changes in JSON file using JS

I'm trying to read a continually updated JSON file produced by syslog-ng. Currently syslog-ng, a logging tool, is set to continually append logs of the data I want to a JSON file. I display the data on my cyber attack map for only 30 seconds, until it's no longer needed. I can read the file and parse what I need, but is there a way to read and parse only the most recent additions to the file over time?
Sample code:
// Assume JSON output = {attack source, attack destination, attack type}
// Required modules
var JSONStream = require('JSONStream');
var fs = require('fs');

// Create a readable stream for JSON file parsing
var stream = fs.createReadStream('output.json', 'utf8'),
    parser = JSONStream.parse(['source', 'dest', 'type']);

// Send the read data to the parser
stream.pipe(parser);

// Take in data from the parser
parser.on('data', function (obj) {
    // Do something with the object
    console.log(obj);
});
I'm using JSONStream to avoid reading the whole log file into memory; JSONStream can still parse the parts I want, but is there a method to read only the changes made after the original read is complete?
Use the code example provided in the library (the JSONStream test code).
You don't have to wait for the end of the stream; you can use the 'data' callback to do your work object by object.
But the file structure should suit the library's expectations, like the example files shipped in the test folder (e.g. all_npm.json).
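JSONStream itself won't track what has already been read, but a hedged sketch using only core fs APIs is to remember the byte offset of the last read and stream only the newly appended bytes whenever the file grows. The file name and field paths below are the ones from the question; the assumption is that each appended entry is a complete JSON document, so a partial slice still parses.
// Sketch: re-read only the bytes appended since the last read.
var fs = require('fs');
var JSONStream = require('JSONStream');

var filePath = 'output.json';
var lastSize = fs.statSync(filePath).size;       // skip what has already been read

fs.watch(filePath, function () {
    var size = fs.statSync(filePath).size;
    if (size <= lastSize) return;                // nothing new (or the file was truncated)

    var stream = fs.createReadStream(filePath, { start: lastSize, end: size - 1 });
    lastSize = size;

    stream.pipe(JSONStream.parse(['source', 'dest', 'type']))
          .on('data', function (obj) {
              console.log(obj);                  // only the newly appended entries
          });
});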

Read pdf as ArrayBuffer to store it in json file with detailed information

I want to build a wrapper around a PDF document to store more information. I tried the FileReader with reader.readAsBinaryString(), but this broke the PDF file (some parts, like images, were missing).
So I tried reader.readAsArrayBuffer(), which seems to get the content without any damage. But I don't know how to convert the ArrayBuffer to a string so I can write its value into a JSON file to export it.
When I use btoa(new TextDecoder("utf-8").decode(e.target.result))
I get an error: The string to be encoded contains characters outside of the Latin1 range.
That sounds like a terrible idea in general, but anyway, it might help someone else...
The easiest and most reliable way to encode a binary file as a string is to encode it as base64.
The FileReader API has a readAsDataURL() method, which returns a data URI composed of a URI header and the base64-encoded binary data.
So if all you want is the data as a string, you just need to grab whatever comes after "base64," in the returned data URI.
const inp = document.getElementById('inp');

inp.onchange = e => {
    const reader = new FileReader();
    reader.onload = e => {
        var myObj = {
            name: inp.files[0].name,
            // keep only the base64 payload, drop the "data:...;base64," header
            data: reader.result.split('base64,')[1]
        };
        console.log(JSON.stringify(myObj));
    };
    reader.readAsDataURL(inp.files[0]);
};

<input type="file" id="inp">
Now, I can't advise storing a whole PDF file, especially one that contains images, in a JSON file. Encoded as base64, the binary data grows by roughly 33%. So you might want to consider saving both the metadata and the original PDF file in a single compressed binary file (e.g. a zip).
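For completeness, going the other way, turning the stored base64 string back into a usable PDF, is just the reverse step; a minimal sketch using only standard web APIs, where myObj is the wrapper object built in the snippet above:
// Sketch: decode the stored base64 payload back into a Blob / object URL.
function base64ToPdfBlob(base64) {
    const binary = atob(base64);                 // base64 -> binary string
    const bytes = new Uint8Array(binary.length);
    for (let i = 0; i < binary.length; i++) {
        bytes[i] = binary.charCodeAt(i);
    }
    return new Blob([bytes], { type: 'application/pdf' });
}

const blob = base64ToPdfBlob(myObj.data);
const url = URL.createObjectURL(blob);           // usable as an <a href> or <iframe src>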

How to post and get data uri image from web api?

I am trying to post a data URI image from JavaScript to a backend ASP.NET Web API. However, it gives me an "input is not a valid Base-64 string" error. Now, I understand that this may be due to the "data:image/png;base64," part that the data URI contains.
Even if I remove this part from the data URI and send only the rest of the string to the server, how do I store the Base-64 string on the server?
Moreover, how do I retrieve this data as an image from the Web API?
NOTE: The image will be less than 200 kB in size and is therefore stored as varbinary(max) in SQL Server.
The thing is, you should convert your image to a byte[] and store it on your server as varbinary:
byte[] arr = new byte[image1.ContentLength];
image1.InputStream.Read(arr, 0, image1.ContentLength);
When retrieving, you should convert the varbinary data to a base64 string and then the base64 string to an image:
string imageBase64Data = Convert.ToBase64String(img);
Here comes the important part: the code above converts the varbinary to a base64 string, but it has to be in the proper format to display the image. That's what the following code does:
string imageDataURL = string.Format("data:image/png;base64,{0}", imageBase64Data);
Session["Photo"] = imageDataURL;
Now you are able to view your image.
Post the image from your client in the form of a string, without specifying the type. In your action you can get the image the following way:
var bytes = Convert.FromBase64String(yourStringHere);
using (var ms = new MemoryStream(bytes))
{
    image = Image.FromStream(ms);
}
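On the client side, a hedged sketch of stripping the "data:image/png;base64," header before posting; the /api/images URL, the imageData field, and the canvas source are placeholders, not part of the original question.
// Sketch: strip the data URI header and POST only the base64 payload.
const dataUri = canvas.toDataURL('image/png');   // or wherever the data URI comes from
const base64Only = dataUri.split('base64,')[1];  // drop "data:image/png;base64,"

fetch('/api/images', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ imageData: base64Only })
}).then(res => {
    console.log('upload status:', res.status);
});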

How do I encode/decode a file correctly after reading it through javascript and pass the file data through ajax?

I have a Django FileField with the multiple attribute set to true. I am trying to build a multiple file uploader where I get the file objects with a plain JavaScript FileReader object. After looping through the file list I read the file data with
reader.readAsBinaryString(file);
and get the desired file data. After passing this data to my views through ajax, I am trying to create a copy of the file in the media folder. I am presently using the following view function:
@csrf_exempt
def storeAttachment(request):
    '''
    Stores the files in the media folder.
    '''
    data = simplejson.loads(request.raw_post_data)
    user_org = data['user_org']
    fileName = data['fileName']
    fileData = data['fileData']
    file_path = MEDIA_ROOT + 'icts_attachments/'
    try:
        path = open((file_path + str(user_org) + "_" + '%s') % fileName, "w+")
        path.write(fileData)
        path.close()
        return HttpResponse(1)
    except IOError:
        return HttpResponse(2)
I am able to write simple text files, .js, .html and a few other formats, but when I try to upload pdf, word, excel, or rar formats I get the following error in my response, even though a file with invalid data is saved at my MEDIA path (the file does not open):
'ascii' codec can't encode characters in position 41-42: ordinal not in range(128)
I tried to encode/decode the file data using various references but with no effect. Any advice will be greatly appreciated.
You get the error because Python 2's default ASCII codec is used when the Unicode string is written to the file. Characters above 127 raise an exception, so use str.encode to go from Unicode to bytes before writing.
Good practice is to use the with keyword when dealing with file objects.
path = u''.join((file_path, user_org, '_', fileName)).encode('utf-8')
# readAsBinaryString produces one character per byte, so latin-1 preserves the original bytes
with open(path, 'wb') as f:
    f.write(fileData.encode('latin-1'))
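On the JavaScript side, one hedged way to sidestep the encoding problem entirely is to send the file as base64 via readAsDataURL instead of readAsBinaryString; the view URL and JSON field names below mirror the question, the rest is illustrative. On the Django side the payload can then be decoded with base64.b64decode and written in binary mode.
// Sketch: send the file as base64 instead of a raw binary string.
var reader = new FileReader();
reader.onload = function () {
    // reader.result looks like "data:application/pdf;base64,JVBERi0x..."
    var base64Data = reader.result.split('base64,')[1];
    fetch('/storeAttachment/', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
            user_org: userOrg,                   // placeholder for the organisation value
            fileName: file.name,
            fileData: base64Data
        })
    });
};
reader.readAsDataURL(file);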

How do I get csv data into netsuite?

I've got an update to my question.
What I really wanted to know was this: how do I get CSV data into NetSuite?
Well, it seems I use the CSV import tool to create a mapping and use this call to import the CSV: nlapiSubmitCSVImport(nlobjCSVImport).
Now my question is: how do I iterate through the object?!
That gets me halfway: I get the CSV data, but I can't seem to find out how to iterate through it in order to manipulate the data. This is, of course, the whole point of a scheduled script.
This is really driving me mad.
@Robert H
I can think of a million reasons why you'd want to import data from a CSV. Billing, for instance, or the various reports on data any company keeps. I wouldn't want to keep this in the file cabinet, nor would I really want to keep the file at all. I just want the data; I want to manipulate it and I want to enter it.
Solution steps:
Step 1: To upload a CSV file we have to use a Suitelet script.
(Note: the file field type is available only for Suitelets and will appear on the main tab of the Suitelet page. Setting the field type to file adds a file upload widget to the page.)
var fileField = form.addField('custpage_file', 'file', 'Select CSV File');
var file = request.getFile('custpage_file'); // in the POST handler, read the uploaded file
var id = nlapiSubmitFile(file); // save it to the file cabinet and keep its internal id
Step 2: Prepare to call a RESTlet script and pass the file id to it.
var recordObj = new Object();
recordObj.fileId = id; // the id returned by nlapiSubmitFile in step 1
// Format the input for RESTlets with the JSON content type
var recordText = JSON.stringify(recordObj);
// Set up the URL of the RESTlet
var url = 'https://rest.na1.netsuite.com/app/site/hosting/restlet.nl?script=108&deploy=1';
// Set up the headers for passing the credentials
var headers = new Array();
headers['Content-Type'] = 'application/json';
headers['Authorization'] = 'NLAuth nlauth_email=amit.kumar2#mindfiresolutions.com, nlauth_signature=*password*, nlauth_account=TSTDRV****, nlauth_role=3';
(Note: nlapiCreateCSVImport is only supported in bundle installation scripts, scheduled scripts, and RESTlets.)
Now call the RESTlet using nlapiRequestURL:
// Calling the RESTlet
var output = nlapiRequestURL(url, recordText, headers, null, "POST");
Step 3: Create a mapping using Import CSV Records, available at Setup > Import/Export > Import CSV Records.
Step 4: Inside the RESTlet script, fetch the file id from the RESTlet parameter. Use the nlapiCreateCSVImport() API and set its mapping to the mapping id created in step 3. Set the CSV file using the setPrimaryFile() function.
var primaryFile = nlapiLoadFile(datain.fileId);
var job = nlapiCreateCSVImport();
job.setMapping(mappingFileId); // set the mapping created in step 3
// Set the file
job.setPrimaryFile(primaryFile.getValue()); // fetches the content of the file and sets it
Step 5: Submit using nlapiSubmitCSVImport().
nlapiSubmitCSVImport(job); // we are done
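Put together, the RESTlet body for steps 4 and 5 might look like the following sketch; the mapping internal id '12345' is a placeholder, not a value from the original answer.
// Sketch: RESTlet POST handler assembling steps 4 and 5.
function importCsv(datain) {
    var primaryFile = nlapiLoadFile(datain.fileId); // file id passed in from the Suitelet
    var job = nlapiCreateCSVImport();
    job.setMapping('12345');                        // placeholder: internal id of the mapping from step 3
    job.setPrimaryFile(primaryFile.getValue());     // hand the CSV content to the import job
    var jobId = nlapiSubmitCSVImport(job);          // queue the import job
    return { jobId: jobId };
}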
There is another way to get around this, although it is neither preferable nor something I would suggest, as it consumes a lot of governance units if you have a large number of records in your CSV file.
Let's say we don't want to use the nlapiCreateCSVImport API, so we continue from step 4.
Just fetch the file id as we did earlier, load the file, and get its contents:
var fileContent = primaryFile.getValue();
Split the file into lines, then split each line into its values and store them in arrays:
var splitLine = fileContent.split("\n"); // split the file into lines
for (var lines = 1; lines < splitLine.length; lines++) // start at 1 to skip the header row
{
    var words = splitLine[lines].split(","); // words holds all the values on one line
    for (var word = 0; word < words.length; word++)
    {
        nlapiLogExecution("DEBUG", "Words:", words[word]);
    }
}
Note: Make sure you don't have an additional blank line in your CSV file.
Finally, create the record and set its field values from the arrays we built above:
var myRec = nlapiCreateRecord('cashsale'); // create the record type of your choice
myRec.setFieldValue('entity', arrCustomerId[i]); // for example, arrCustomerId is an array of customer internal ids
var submitRec = nlapiSubmitRecord(myRec); // and we are done
Fellow NetSuite user here. I've been using SuiteScripts for a while now but have never seen the nlobjCSVImport object or nlapiSubmitCSVImport. I looked in the documentation; it shows up, but there is no page describing the details. Care to share where you got the doc from?
With the doc for the CSVImport object I might be able to provide some more help.
P.S. I tried posting this message as a comment but the "Add comment" link didn't show up for some reason. Still new to SOF.
CSV to JSON:
convert csv file to json object datatable
https://code.google.com/p/jquery-csv/
If you know the structure of the CSV file, just do a for loop and map the fields to the corresponding field-setting calls (setFieldValue).
Should be pretty straightforward.
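Tying that suggestion together with the manual-parsing approach above, here is a minimal SuiteScript 1.0 sketch; the "customerId,memo" column layout and the cashsale record type are assumptions for illustration only.
// Sketch: iterate over parsed CSV lines and create one record per line.
// Assumes a header row and a "customerId,memo" column layout purely for illustration.
function processCsv(fileContent) {
    var lines = fileContent.split('\n');
    for (var i = 1; i < lines.length; i++) {      // start at 1 to skip the header row
        if (!lines[i]) continue;                  // skip trailing blank lines
        var cols = lines[i].split(',');
        var rec = nlapiCreateRecord('cashsale');  // example record type
        rec.setFieldValue('entity', cols[0]);     // customer internal id from column 1
        rec.setFieldValue('memo', cols[1]);       // memo text from column 2
        nlapiSubmitRecord(rec);
        nlapiLogExecution('DEBUG', 'Created record for CSV line ' + i);
    }
}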
