Problems with File plugin in Ionic2 - javascript

I'm integrating Quickblox into my Ionic 2 app, and so far I've been able to do everything except upload a file.
In Quickblox you have to upload a file using a function they provide, which according to the JS documentation looks like this:
var inputFile = $("input[type=file]")[0].files[0];
var params = {
    name: inputFile.name,
    file: inputFile,
    type: inputFile.type,
    size: inputFile.size,
    public: false
};
QB.content.createAndUpload(params, function(err, response) {
    if (err) {
        console.log(err);
    } else {
        console.log(response);
        var uploadedFile = response;
        var uploadedFileId = response.id;
    }
});
So I translated the above code to TypeScript and ended up with something like this:
uploadFile(filename) {
    File.resolveDirectoryUrl(cordova.file.dataDirectory).then(
        (dirEntry) => {
            File.getFile(dirEntry, filename, { create: false }).then(
                (fileEntry) => {
                    console.log(fileEntry);
                    fileEntry.file((file) => {
                        console.log(file);
                        var params = {
                            name: file['name'],
                            file: file,
                            type: file['type'],
                            size: file['size'],
                            'public': false
                        };
                        quickblox.content.createAndUpload(params,
                            (err, response) => {
                                if (err) {
                                    console.log(err);
                                } else {
                                    console.log(response);
                                    var uploadedFileId = response.id;
                                    var msg = {
                                        type: 'groupchat',
                                        body: "[attachment]",
                                        extension: {
                                            save_to_history: 1,
                                        }
                                    };
                                    msg["extension"]["attachments"] = [{ id: uploadedFileId, type: 'photo' }];
                                    quickblox.chat.send(this.xmpp_room_jid, msg);
                                }
                            });
                    });
                }).catch(
                (err) => {
                    console.log(err);
                }
            );
        }
    );
}
This works in the sense that I get OK responses from the Quickblox server, but when I go to the Quickblox admin console to check the uploaded content, I find that the image I uploaded has 0 bytes.
After checking the code for a while, I compared all my function calls side by side with the Quickblox example app, and the only difference I could find was in the File object.
This is the File object I get in my Ionic 2 app:
And this is the one I get in the Quickblox JS example:
Everything else looks identical except this File object.
I'm almost sure this is the problem I'm having, and because I'm quite new to this field, I couldn't find a way to cast my Ionic File object to something like the File object in the JS example.
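For reference, the closest workaround I've found (untested, and reusing the names from my code above) is to read the entry's contents with a FileReader and wrap the bytes in a Blob before handing them to createAndUpload, since the plugin's File object apparently doesn't carry the actual bytes the way a browser File does:
fileEntry.file((file) => {
    var reader = new FileReader();
    reader.onloadend = () => {
        // reader.result is an ArrayBuffer with the file's actual bytes
        var blob = new Blob([reader.result], { type: file['type'] });
        var params = {
            name: file['name'],
            file: blob, // pass the Blob instead of the plugin's file object
            type: file['type'],
            size: blob.size,
            'public': false
        };
        quickblox.content.createAndUpload(params, (err, response) => {
            if (err) { console.log(err); } else { console.log(response); }
        });
    };
    reader.readAsArrayBuffer(file);
});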
Thanks in advance to everyone for your time and help.
EDIT:
I'm attaching the request/response logs from my Ionic app:

Could you please post the code you used to connect to chat, create a session, and open a video call?
The documentation on Quickblox is very bad and I got stuck at connecting to chat.

Related

How to read json file from storage blob container with azure function using javascript?

I'm totally new to Azure and I would like to create an Azure Function which will read the content of file.json from an Azure Storage container.
Folder structure:
Storage account name: storageaccounttest
Container name: test
File name: file.json
File.json:
[
    {
        "name": "Kate",
        "age": "28"
    },
    {
        "name": "John",
        "age": "30"
    }
]
CORS on the storage account: GET enabled.
Environment variables added: process.env.AZURE_STORAGE_NAME, process.env.AZURE_STORAGE_KEY and process.env.AZURE_CONNECTION_STRING.
I'm using Visual Studio Code to deploy the function.
I installed the dependencies locally:
"dependencies": {
"azure-storage": "^2.10.3",
"dotenv": "^8.1.0"
}
I chose the JavaScript -> HttpTrigger fn -> anonymous options.
I'm using the getBlobToText fn.
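One thing I'm not sure about (an assumption on my part): dotenv only loads a local .env file if you call it, and createBlobService() with no arguments reads its settings from the environment, so when running locally something like this would be needed at the top:
require('dotenv').config(); // loads variables from a local .env file into process.env
var storage = require('azure-storage');
// with no arguments, createBlobService() reads AZURE_STORAGE_CONNECTION_STRING
// (or AZURE_STORAGE_ACCOUNT + AZURE_STORAGE_ACCESS_KEY) from the environment
var blobService = storage.createBlobService();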
My index.js:
var storage = require('azure-storage');
var blobService = storage.createBlobService();
var containerName = 'test';
var blobName = 'file.json';
module.exports = blobService.getBlobToText(
    containerName,
    blobName,
    function(err, blobContent) {
        if (err) {
            console.error("Couldn't download blob");
            console.error(err);
        } else {
            console.log("Successfully downloaded blob");
            console.log(blobContent);
        }
    });
The fn deploys successfully, but I'm not able to see results.
After starting, the fn finishes with status 500, Internal Server Error, and the console shows "No new trace in the past 1 min(s)".
What did I do wrong?
Just a summary to help others who hit the same issue.
I think you needed to use context.res to pass the blobContent value to the output response, as the official document Azure Functions JavaScript developer guide says.
Here is my sample code, using a Promise to solve it.
var azure = require('azure-storage');
var blobService = azure.createBlobService();
var containerName = 'test';
var blobName = 'file.json';

async function getBlobContent(containerName, blobName) {
    return new Promise((resolve, reject) => {
        blobService.getBlobToText(containerName, blobName, function(err, blobContent) {
            if (err) {
                reject(err);
            } else {
                resolve(blobContent);
            }
        });
    });
}

module.exports = async function (context, req) {
    await getBlobContent(containerName, blobName).then(
        function(content) {
            context.res = {
                headers: { "Content-Type": "application/json" },
                body: content
            };
        }, function(error) {
            context.res = {
                status: 400,
                body: error
            };
        }
    );
};
It works as the figure below.
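For completeness, this assumes the standard HTTP bindings; a typical function.json for this function would look roughly like the following (the binding names req and res match the handler's parameters; this is the default scaffold, adjust as needed):
{
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": [ "get", "post" ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    }
  ]
}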

NodeJS - Request file and zip it

I am currently in the process of creating a REST API for my personal website. I'd like to include some downloads, and I would like to offer the possibility of selecting multiple ones and downloading them as a zip file.
My first approach was pretty easy: an array of URLs, a request for each of them, zip it, send it to the user, delete it. However, I think this approach is too dirty considering there are things like streams around, which seem to be a good fit for this.
Now, I've been experimenting and am currently struggling with the basic concept of working with streams and events across different scopes.
The following worked:
const r = request(url, options);
r.on('response', function(res) {
    res.pipe(fs.createWriteStream('./file.jpg'));
});
From my understanding, r is an incoming stream in this scenario; I listen for the response event on it and, as soon as it occurs, pipe it to a write stream that writes to the file system.
My first step was to refactor this so it fits my case better, but I already failed here:
async function downloadFile(url) {
    return request({ method: 'GET', uri: url });
}
Now I wanted a function which calls downloadFile() with different URLs and saves all those files to disk using createWriteStream() again:
const urls = ['https://download1', 'https://download2', 'https://download3'];
urls.forEach(element => {
    downloadFile(element).then(data => {
        data.pipe(fs.createWriteStream('file.jpg'));
    });
});
Using the debugger I found out that the response event does not exist on the data object -- maybe that's already the issue? Moreover, I figured out that data.body contains the bytes of my downloaded document (a PDF in this case), so I wonder if I could just stream this to some other place?
After reading some Stack Overflow threads I found the following module: archiver.
Reading this thread: Dynamically create and stream zip to client
@dankohn suggested an approach like this:
archive
    .append(fs.createReadStream(file1), { name: 'file1.txt' })
    .append(fs.createReadStream(file2), { name: 'file2.txt' });
This makes me assume I need to be able to extract a stream from my data object to proceed.
Am I on the wrong track here, or am I getting something fundamentally wrong?
Edit: thanks for fixing my question, I don't know what happened there.
Using archiver seems to be a valid approach; however, it is advisable to use streams when feeding large data from the web into the zip archive, otherwise the whole archive data would need to be held in memory.
archiver does not support adding files from streams, but zip-stream does. For reading a stream from the web, request comes in handy.
Example
// npm install -s express zip-stream request
const request = require('request');
const ZipStream = require('zip-stream');
const express = require('express');
const app = express();

app.get('/archive.zip', (req, res) => {
    var zip = new ZipStream();
    zip.pipe(res);

    var stream = request('https://loremflickr.com/640/480');
    zip.entry(stream, { name: 'picture.jpg' }, err => {
        if (err)
            throw err;
    });
    zip.finalize();
});

app.listen(3000);
Update: Example for using multiple files
Adding an example which processes the next file in the callback function of zip.entry() recursively.
app.get('/archive.zip', (req, res) => {
    var zip = new ZipStream();
    zip.pipe(res);

    var queue = [
        { name: 'one.jpg', url: 'https://loremflickr.com/640/480' },
        { name: 'two.jpg', url: 'https://loremflickr.com/640/480' },
        { name: 'three.jpg', url: 'https://loremflickr.com/640/480' }
    ];

    function addNextFile() {
        var elem = queue.shift();
        var stream = request(elem.url);
        zip.entry(stream, { name: elem.name }, err => {
            if (err)
                throw err;
            if (queue.length > 0)
                addNextFile();
            else
                zip.finalize();
        });
    }

    addNextFile();
});
Using Async/Await
You can encapsulate it into a promise to use async/await like:
await new Promise((resolve, reject) => {
    zip.entry(stream, { name: elem.name }, err => {
        if (err) reject(err);
        else resolve();
    });
});
zip.finalize();
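Putting the pieces together, the whole queue from the previous example could then be processed with a plain loop instead of the recursion; a sketch based on the snippets above (same ZipStream, request and queue as before):
app.get('/archive.zip', async (req, res) => {
    var zip = new ZipStream();
    zip.pipe(res);
    var queue = [
        { name: 'one.jpg', url: 'https://loremflickr.com/640/480' },
        { name: 'two.jpg', url: 'https://loremflickr.com/640/480' }
    ];
    for (const elem of queue) {
        var stream = request(elem.url);
        // zip-stream accepts only one entry at a time, which the await enforces
        await new Promise((resolve, reject) => {
            zip.entry(stream, { name: elem.name }, err => err ? reject(err) : resolve());
        });
    }
    zip.finalize();
});
Note that zip-stream only allows one entry to be added at a time, which is exactly what awaiting each zip.entry() call guarantees.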

Api Youtube Upload with Service Account

I want to upload a video from my server to YouTube.
I created a service account to authenticate:
var path = "720_final.mov"
// var key = readJson(`${__dirname}/secret.json`);
var google = require('googleapis');
var youTube = google.youtube('v3');
var authClient = new google.auth.JWT(
"xxxx#yyyyyyy.gserviceaccount.com",
"secret.p12",
null,
['https://www.googleapis.com/auth/youtube','https://www.googleapis.com/auth/youtube.upload'],
null
);
authClient.authorize(function(err, tokens) {
if (err) {
console.log(err);
return;
}
authClient.setCredentials(tokens);
var req = Youtube.videos.insert({
auth: authClient,
resource: {
// Video title and description
snippet: {
title: "Testing YoutTube API NodeJS module"
, description: "Test video upload via YouTube API"
}
// I don't want to spam my subscribers
, status: {
privacyStatus: "public"
}
}
// This is for the callback function
, part: "snippet,status"
// Create the readable stream to upload the video
, media: {
body: fs.createReadStream(path)
}
}, (err, data) => {
console.log(err)
console.log(data)
process.exit();
});
My file 720_final.mov is 33 MB.
When the upload reaches 34.6 MB, it seems like the upload cannot find the user's channel to upload to??
However, when I use tokens generated with OAuth2, the upload succeeds. Example using OAuth2: https://github.com/IonicaBizau/youtube-api/blob/master/example/index.js
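(For comparison, this is roughly the OAuth2 setup that works for me; the client id/secret and refresh token are placeholders obtained from the Google developer console and a one-time consent flow:)
var google = require('googleapis');
var OAuth2 = google.auth.OAuth2;
var oauth2Client = new OAuth2("CLIENT_ID", "CLIENT_SECRET", "REDIRECT_URL");
// refresh token obtained once via the OAuth2 consent screen
oauth2Client.setCredentials({ refresh_token: "REFRESH_TOKEN" });
// then pass oauth2Client as `auth` to youTube.videos.insert as above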
Can anyone give me a solution, or a keyword to fix this?

ionic download file, save to temporary filesystem and open with default app

I've searched far and wide but couldn't find an appropriate answer for my use case. Basically, I'd like to download a base64-encoded file (could be a PDF, PNG or JPEG; I'm using a PDF in the following example) from my backend, save it to a TEMPORARY fileSystem folder, then open it, possibly using the appropriate app on the device if present. Let /file be a route served by the following ASP.NET MVC WebAPI controller:
public class FileController : ApiController
{
    // POST api/file/
    public HttpResponseMessage Post([FromBody]string fullPath)
    {
        HttpResponseMessage result = new HttpResponseMessage(HttpStatusCode.OK);
        var stream = new FileStream(fullPath, FileMode.Open);
        result.Content = new StreamContent(stream);
        result.Content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
        return result;
    }
}
I came up with the following AngularJS script:
$scope.download = function(fileName) {
    window.requestFileSystem(window.TEMPORARY, 1024*1024*500, function(fileSystem) {
        fileSystem.root.getDirectory("TempDir", { create: true }, function(dirEntry) {
            dirEntry.getFile(fileName, { create: true, exclusive: false }, function(fileEntry) {
                $http.post("/file", JSON.stringify(fileName), {
                    headers: { Accept: "application/octet-stream" }
                }).success(function(res) {
                    fileEntry.createWriter(function(fileEntryContent) {
                        var blob = new Blob([res], { type: "application/pdf" });
                        fileEntryContent.write(blob);
                        // Now load the file
                        $scope.load(fileName);
                    });
                }).error(function(err) {
                    console.warn(err);
                });
            }, function(err) {
                console.warn("getFile failed:", err);
            });
        }, function(err) {
            console.warn("getDirectory failed:", err);
        });
    }, function(err) {
        console.warn("requestFileSystem failed:", err);
    });
};
$scope.download("foo.pdf");
$scope.load = function(fileName) {
    window.requestFileSystem(window.TEMPORARY, 1024*1024*500, function(fileSystem) {
        fileSystem.root.getDirectory("TempDir", { create: false }, function(dirEntry) {
            dirEntry.getFile(fileName, { create: false, exclusive: false }, function(fileEntry) {
                // This is where the magic needs to happen!
            }, function(err) {
                console.warn("getFile failed:", err);
            });
        }, function(err) {
            console.warn("getDirectory failed:", err);
        });
    }, function(err) {
        console.warn("requestFileSystem failed:", err);
    });
};
I'm currently stumped at the loading phase: I tried window.open-ing the base64-encoded content of the file and $http.get-ting fileEntry.toURL(), but nothing seems to work. I checked out Cordova's File Opener 2 plugin, but it seems it can only open files stored on the device's sdcard or such. Any clue is welcome! Cheers.
The FileOpener2 plugin is pretty much your only option I think, and it can work in this scenario. You'll need to make sure the file is saved outside your app's container on the device, as other apps cannot access it. You can find the file structure for each platform, and what is/isn't public, on the plugin page. You'll also need to save to different locations depending on the platform. This works for me:
if (ionic.Platform.isIOS()) storagePath = cordova.file.cacheDirectory + "/temp";
else if(ionic.Platform.isAndroid()) storagePath = cordova.file.externalRootDirectory + "/yourapp/temp";
You can then use storagePath as the base for your targetPath when downloading. I would highly recommend using ngCordova. The sample below is partially based on something I'm successfully using on iOS and Android, but I haven't tested it as-is:
// add your options here, i.e. any headers you need to add to the request
var options = {};
$cordovaFileTransfer.download(url, targetPath, options, true).then(function(result) {
    // On successful transfer, try to extract the file object
    result.file(function (file) {
        var localFile = file.localURL;
        resolveLocalFileSystemURL(localFile, function (entry) {
            var nativePath = entry.toURL();
            // Open in another app; will fail if no app exists that can open the mime type
            $cordovaFileOpener2.open(nativePath, mime).then(function () {
                // Success!
            }, function (err) {
                // Error!
            });
        }, function (error) {
            // handle error here
        });
    }, function (error) {
        // handle error here
    });
}, function (err) {
    // handle error here
});
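For completeness, the variables used in the snippet would be wired up with something like this (all values hypothetical):
// hypothetical values; adjust to your backend and file type
var url = "https://yourbackend.example/api/file";
var targetPath = storagePath + "/foo.pdf";
var mime = "application/pdf"; // must match the type of the downloaded file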

JavaScript aws-sdk S3 deleteObject(s) succeeds but doesn't actually delete anything

In the MEAN.js app I'm building, I upload images to AWS S3. I'm trying to use the AWS SDK to delete unwanted images from the site, but after a successful AJAX call the file remains on S3.
I have required the AWS SDK like so; it works both with and without the config variables (as it should):
var aws = require('aws-sdk');
aws.config.update({accessKeyId: process.env.AWS_ACCESS_KEY_ID, secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY});
For my route I have the following code:
router.post('/delete', auth, function(req, res, next) {
    if (req.body.key) {
        var s3 = new aws.S3();
        var params = {
            Bucket: 'bucket name',
            Key: req.body.key
        };
        s3.deleteObject(params, function (err, data) {
            if (err) {
                console.log(err, err.stack);
                return next(err);
            }
            console.log(data);
            res.end('done');
        });
    }
});
I get a 200 response and {} is logged to the console but the file is not deleted from storage. I've also tried using the deleteObjects method like so:
var params = {
    Bucket: 'bucket name',
    Delete: {
        Objects: [
            {
                Key: req.body.key
            }
        ]
    }
};
s3.deleteObjects(params, function (err, data) {
    if (err) {
        console.log(err, err.stack);
        return next(err);
    }
    console.log(data);
    res.end('done');
});
When I use deleteObjects I get { Deleted: [ { Key: 'file name' } ], Errors: [] } as a response, but the file is still on S3.
Am I doing something wrong? I thought I followed the documentation to a T.
Also, the issue occurs whether or not versioning is enabled on the bucket. With versioning enabled my response is:
{ Deleted:
   [ { Key: 'file name',
       DeleteMarker: true,
       DeleteMarkerVersionId: 'long id' } ],
  Errors: [] }
Try this one. You need to use promise() to ensure that the object is deleted before ending the execution. Waiting 6 hours just for a simple object deletion is not normal, even considering S3's 99.999999999% durability.
var params = {
    Bucket: bucket,
    Key: video
};
try {
    // note: await requires this to run inside an async function
    await s3.deleteObject(params, function(err, data) {
        if (err) console.log(err, err.stack);
        else console.log("Response:", data);
    }).promise();
} catch (e) {}
It looks like the first comment was right: it takes some time for files to be removed from AWS S3. In this case it was over an hour until it disappeared (could have been 6 hours, I stepped away for quite a bit).
I really don't think AWS takes that long to delete. I was having the same issue and solved it by changing the file name value: I was initially using the URL instead of the filename I used when uploading the image.
I've noticed AWS reports the key as deleted even though it does not exist. In my case I sent the file path as the object key, but the key was actually the file path minus the leading /.
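A quick way to check what key S3 actually holds (and whether the key you pass to deleteObject matches it exactly) is to list the objects under a prefix first; a small sketch with the same SDK, using a hypothetical prefix:
var listParams = { Bucket: 'bucket name', Prefix: 'uploads/' }; // hypothetical prefix
s3.listObjectsV2(listParams, function (err, data) {
    if (err) return console.log(err, err.stack);
    // compare these keys character-for-character with the key passed to deleteObject
    data.Contents.forEach(function (obj) {
        console.log(JSON.stringify(obj.Key));
    });
});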
