Session.Create(FlowFile) Transfers with no content - javascript

I am trying to build an ExecuteScript processor for NiFi.
It takes a JSON file, splits it and sends the pieces to the next processor, which is a MongoDB writer.
The logic works so far. The main problem is that I cannot get the processor to create and send a new FlowFile for each JSON document produced from the input JSON. I got it partially working, but unfortunately all the FlowFiles come out empty. Is there something wrong with the flow (from creating a new FlowFile to transferring it)?
Here is a code snippet:
var flowFile = session.get();
if (flowFile != null) {
    var StreamCallback = Java.type("org.apache.nifi.processor.io.StreamCallback");
    var IOUtils = Java.type("org.apache.commons.io.IOUtils");
    var StandardCharsets = Java.type("java.nio.charset.StandardCharsets");
    try {
        flowFile = session.write(flowFile,
            new StreamCallback(function (inputStream, outputStream) {
                var content = IOUtils.toString(inputStream, StandardCharsets.UTF_8);
                var json = JSON.parse(content);
                var events = json["events"];
                var mongoEvent = "";
                var flowFileList = [];
                for (var x = 0; x < json["events"].length; x++) {
                    try {
                        var newFlowFile = session.create();
                        mongoEvent = constructJSONEvent(x, json); // Here we will receive our new JSON
                        outputStream.write(mongoEvent.getBytes(StandardCharsets.UTF_8));
                        session.transfer(newFlowFile, REL_SUCCESS);
                    } catch (e) {
                        session.transfer(newFlowFile, REL_FAILURE);
                    }
                }
            }));
        session.transfer(flowFile, REL_SUCCESS);
    } catch (e) {
        session.transfer(flowFile, REL_FAILURE);
    }
}
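For what it's worth, the usual pattern for this kind of split is to write each event into its own new FlowFile with session.write, instead of writing everything to the parent FlowFile's outputStream (which is why the children come out empty here). Below is only a minimal sketch of that pattern, assuming constructJSONEvent returns a JSON string as in the snippet above; error handling is left out for brevity.

var InputStreamCallback = Java.type("org.apache.nifi.processor.io.InputStreamCallback");
var OutputStreamCallback = Java.type("org.apache.nifi.processor.io.OutputStreamCallback");
var IOUtils = Java.type("org.apache.commons.io.IOUtils");
var StandardCharsets = Java.type("java.nio.charset.StandardCharsets");

var flowFile = session.get();
if (flowFile != null) {
    // Read the parent content once.
    var content = "";
    session.read(flowFile, new InputStreamCallback(function (inputStream) {
        content = IOUtils.toString(inputStream, StandardCharsets.UTF_8);
    }));
    var json = JSON.parse(content);
    for (var x = 0; x < json["events"].length; x++) {
        var mongoEvent = constructJSONEvent(x, json); // helper from the question
        // Create a child FlowFile (inherits the parent's attributes) and write this event into it.
        var newFlowFile = session.create(flowFile);
        newFlowFile = session.write(newFlowFile, new OutputStreamCallback(function (outputStream) {
            IOUtils.write(mongoEvent, outputStream, StandardCharsets.UTF_8);
        }));
        session.transfer(newFlowFile, REL_SUCCESS);
    }
    // Every event has been split out, so drop the parent FlowFile (or route it to an "original" relationship).
    session.remove(flowFile);
}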

Related

Downloading files in Blazor Webassembly when using navigation manager and needing to check user claims

I have a Blazor WASM solution in which I am trying to build functionality for downloading data from a (Radzen) datagrid. I start off by calling an export service from the code of my component like so:
await _exportService.Export("expense-claims-admin", type, new Query()
{
    OrderBy = grid.Query.OrderBy,
    Filter = grid.Query.Filter,
    Select = "CreateDate,User.FirstName,User.LastName,Status,ExpenseRecords.Count AS NumberOfExpenseRecords,MileageRecords.Count AS NumberOfMileageRecords,TotalAmount"
});
My export service then runs the following code:
public async Task Export(string table, string type, Query query = null)
{
    _navigationManager.NavigateTo(query != null ? query.ToUrl($"/export/{table}/{type}") : $"/export/{table}/{type}", true);
}
This calls a controller in my server project, which runs a different action depending on the request URL. For example:
[HttpGet("/export/expense-claims-admin/excel")]
public IActionResult ExportExpenseClaimsToExcelAdmin()
{
    var claimsPrincipal = User;
    var companyId = claimsPrincipal.FindFirst(c => c.Type == "companyId");
    if (companyId != null)
    {
        return ToExcel(ApplyQuery(context.expense_claims.Where(x => x.DeleteDate == null && x.CompanyId == int.Parse(companyId.Value)), Request.Query));
    }
    //TODO - Return something other than null
    return null!;
}
This then calls one of two methods depending on whether I'm trying to export to Excel or CSV. Below is the ToExcel method that gets called above:
public FileStreamResult ToExcel(IQueryable query, string fileName = null)
{
    var columns = GetProperties(query.ElementType);
    var stream = new MemoryStream();
    using (var document = SpreadsheetDocument.Create(stream, SpreadsheetDocumentType.Workbook))
    {
        var workbookPart = document.AddWorkbookPart();
        workbookPart.Workbook = new Workbook();
        var worksheetPart = workbookPart.AddNewPart<WorksheetPart>();
        worksheetPart.Worksheet = new Worksheet();
        var workbookStylesPart = workbookPart.AddNewPart<WorkbookStylesPart>();
        GenerateWorkbookStylesPartContent(workbookStylesPart);
        var sheets = workbookPart.Workbook.AppendChild(new Sheets());
        var sheet = new Sheet() { Id = workbookPart.GetIdOfPart(worksheetPart), SheetId = 1, Name = "Sheet1" };
        sheets.Append(sheet);
        workbookPart.Workbook.Save();
        var sheetData = worksheetPart.Worksheet.AppendChild(new SheetData());
        var headerRow = new Row();
        foreach (var column in columns)
        {
            headerRow.Append(new Cell()
            {
                CellValue = new CellValue(column.Key),
                DataType = new EnumValue<CellValues>(CellValues.String)
            });
        }
        sheetData.AppendChild(headerRow);
        foreach (var item in query)
        {
            var row = new Row();
            foreach (var column in columns)
            {
                var value = GetValue(item, column.Key);
                var stringValue = $"{value}".Trim();
                var cell = new Cell();
                var underlyingType = column.Value.IsGenericType &&
                    column.Value.GetGenericTypeDefinition() == typeof(Nullable<>) ?
                    Nullable.GetUnderlyingType(column.Value) : column.Value;
                var typeCode = Type.GetTypeCode(underlyingType);
                if (typeCode == TypeCode.DateTime)
                {
                    if (!string.IsNullOrWhiteSpace(stringValue))
                    {
                        cell.CellValue = new CellValue() { Text = ((DateTime)value).ToOADate().ToString(System.Globalization.CultureInfo.InvariantCulture) };
                        cell.DataType = new EnumValue<CellValues>(CellValues.Number);
                        cell.StyleIndex = (UInt32Value)1U;
                    }
                }
                else if (typeCode == TypeCode.Boolean)
                {
                    cell.CellValue = new CellValue(stringValue.ToLower());
                    cell.DataType = new EnumValue<CellValues>(CellValues.Boolean);
                }
                else if (IsNumeric(typeCode))
                {
                    if (value != null)
                    {
                        stringValue = Convert.ToString(value, CultureInfo.InvariantCulture);
                    }
                    cell.CellValue = new CellValue(stringValue);
                    cell.DataType = new EnumValue<CellValues>(CellValues.Number);
                }
                else
                {
                    cell.CellValue = new CellValue(stringValue);
                    cell.DataType = new EnumValue<CellValues>(CellValues.String);
                }
                row.Append(cell);
            }
            sheetData.AppendChild(row);
        }
        workbookPart.Workbook.Save();
    }
    if (stream?.Length > 0)
    {
        stream.Seek(0, SeekOrigin.Begin);
    }
    var result = new FileStreamResult(stream, "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet");
    result.FileDownloadName = (!string.IsNullOrEmpty(fileName) ? fileName : "Export") + ".xlsx";
    return result;
}
The issue I'm hitting is that the claims on the User object are never populated with the information I need to do the checks in my export controller. I know this is because I'm calling the controller with the navigation manager rather than going through an HttpClient request, so no authorisation token is passed; but if I use HttpClient instead, the file is not downloaded by the browser.
I have tried using a combination of HttpClient and JavaScript to get the file to download (code below), but this returns a text file rather than an xlsx or csv file, and the content is just gibberish (to me, anyway).
Here's the code in my service if I use HttpClient instead of the navigation manager:
public async Task Export(string table, string type, Query query = null)
{
    var response = await _client.GetAsync(query != null ? query.ToUrl($"/export/{table}/{type}") : $"/export/{table}/{type}");
    var fileStream = await response.Content.ReadAsStreamAsync();
    using var streamRef = new DotNetStreamReference(stream: fileStream);
    await _js.InvokeVoidAsync("downloadFileFromStream", "Export", streamRef);
}
and here's the javascript code I'm using to download the file:
async function downloadFileFromStream(fileName, contentStreamReference) {
    const arrayBuffer = await contentStreamReference.arrayBuffer();
    const blob = new Blob([arrayBuffer]);
    const url = URL.createObjectURL(blob);
    triggerFileDownload(fileName, url);
    URL.revokeObjectURL(url);
}

function triggerFileDownload(fileName, url) {
    const anchorElement = document.createElement("a");
    anchorElement.href = url;
    if (fileName) {
        anchorElement.download = fileName;
    }
    anchorElement.click();
    anchorElement.remove();
}
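As an aside on the JS interop route: the "text file full of gibberish" symptom is often just the browser saving the download with no MIME type and no extension, because the Blob above is created untyped and the name passed from the C# side is plain "Export". A small sketch of the same helper with the content type wired through (the MIME string below is the standard one for .xlsx; the extra contentType parameter is an assumption, not something in the original code):

async function downloadFileFromStream(fileName, contentStreamReference, contentType) {
    const arrayBuffer = await contentStreamReference.arrayBuffer();
    // Give the blob its real MIME type so the browser doesn't treat it as plain text.
    const blob = new Blob([arrayBuffer], {
        type: contentType || "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet"
    });
    const url = URL.createObjectURL(blob);
    // Pass a name that already carries the extension, e.g. "Export.xlsx".
    triggerFileDownload(fileName, url);
    URL.revokeObjectURL(url);
}

The contentType value would have to come from the caller (for example from the response's Content-Type header), which is an assumption about your setup rather than part of the code shown here.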
I've read that using JS interop to do file downloads is slow and that you're also limited by file size, so it seems like the best way to get this working would be to call the controller with the navigation manager; I just can't work out how to get those user claims if I do that, and I really need to check them to make sure I'm returning the right data.

GAS: how to upload multiple files to Google Drive

I'm trying to upload multiple files to Google Drive using Google Apps Script.
My code works fine when I want to upload one file:
// UPLOAD IMG IN GOOGLE DRIVE
var url = 'http://www.pngall.com/wp-content/uploads/2/1-Number-PNG-Picture.png';
var response = UrlFetchApp.fetch(url); // get api endpoint
var rc = response.getResponseCode();
if(rc=200){
    var fileBlob = response.getBlob();
    var folder = DriveApp.getFolderById("xxxxxx")
    if(folder !=null) {
        var file_img = folder.createFile(fileBlob)
        var img = file_img.getUrl();
    }
}
} else {
    var img = "";
}
// APPEND VALUE TO SHEET
sheet.appendRow([img]);
I'm trying to modify the above script to upload multiple files to Google Drive, but my code doesn't work.
This is my (not working) code:
// UPLOAD IMG IN GOOGLE DRIVE
var url = ['http://www.pngall.com/wp-content/uploads/2/1-Number-PNG-Picture.png', 'https://www.yourcloudworks.com/wp-content/uploads/2019/09/number-digit-2-png-transparent-images-transparent-backgrounds-Number-2-PNG-images-free-download_PNG14949.png'];
for(var i=0; i<url.length; i++){
    var response = UrlFetchApp.fetchAll(url);
    var rc = response.getResponseCode();
    if(rc=200){
        var fileBlob = response.getBlob();
        var folder = DriveApp.getFolderById("xxxxxx")
        if(folder !=null) {
            var file_img = folder.createFile(fileBlob[i])
            var img = file_img.getUrl()[i];
        }
    }
} else {
    var img = "";
}
// APPEND VALUE TO SHEET
sheet.appendRow(img[i]);
}
TypeError: response.getResponseCode is not a function
Any help?
Modification points:
In your script, for(var i=0; i<url.length; i++){}else{} is used. It looks like the if statement and the for loop have been mixed up.
When you want to compare a value in an if statement, please change if(rc=200){ to if(rc==200){.
The response value from UrlFetchApp.fetchAll(url) is an array. I think this is the reason for the error message.
folder.createFile(fileBlob[i]) should be folder.createFile(fileBlob).
file_img.getUrl()[i] should be file_img.getUrl().
When file_img.setTrashed(true) is used, the downloaded files are moved to the trash. If you don't want to move them there, please remove that line.
When the values are put into the Spreadsheet in one request, the process cost is lower, so I would like to propose using setValues instead of appendRow.
When the above points are reflected in your script, it becomes as follows.
Modified script:
Please copy and paste the following modified script, then set the sheet variable and the folder ID.
function myFunction() {
    // var sheet = SpreadsheetApp.getActiveSheet();
    var url = ['http://www.pngall.com/wp-content/uploads/2/1-Number-PNG-Picture.png', 'https://www.yourcloudworks.com/wp-content/uploads/2019/09/number-digit-2-png-transparent-images-transparent-backgrounds-Number-2-PNG-images-free-download_PNG14949.png'];
    var requests = url.map(u => ({url: u, muteHttpExceptions: true}));
    var response = UrlFetchApp.fetchAll(requests);
    var imgs = [];
    for (var i = 0; i < response.length; i++) {
        if (response[i].getResponseCode() == 200) {
            var fileBlob = response[i].getBlob();
            var folder = DriveApp.getFolderById("xxxxxx");
            if (folder != null) {
                var file_img = folder.createFile(fileBlob);
                imgs.push([file_img.getUrl()]);
                // file_img.setTrashed(true); // When this script is used, the downloaded files are moved to the trashbox.
            }
        }
    }
    if (imgs.length > 0) {
        sheet.getRange(sheet.getLastRow() + 1, 1, imgs.length).setValues(imgs);
    }
}
When muteHttpExceptions: true is used, the script keeps running even when a request returns an error status.
References:
fetchAll(requests)
if...else
Loops and iteration

How to send a file using glideajax?

I've been trying to get this to work for a while now. I have a UI action that includes a UI page through a GlideDialog. The UI page is just a form with a bunch of text inputs and one file input. On click of the submit button I am sending the form data as well as the file attachment via GlideAjax:
var issueObj = {};
var ga = new GlideAjax(glideAjax);
var name = $j_jb('#name').val();
var address = $j_jb('#address').val();
var file = $j_jb('#jira_attachment')[0].files[0];
issueObj.name = name;
issueObj.address = address;
var IssueObjString = JSON.stringify(issueObj);
ga.addParam('sysparm_name','createIssue');
ga.addParam('sysparm_issueObj', IssueObjString);
ga.addParam('sysparm_attachment', file);
var that = this;
ga.getXML(function (response) {
    var responseStatus = response.responseXML.documentElement.getAttribute("answer");
    var DOMData = "";
    if (responseStatus) {
        that.displayMessage(jiraAlert['success-insertion']);
    }
    else {
        that.displayMessage(jiraAlert['error-insertion']);
    }
});
Here is the corresponding Script Include method that it calls:
createIssue: function() {
    var issueObj = this.getParameter("sysparm_issueObj");
    var fileAttachment = this.getParameter("sysparm_attachment");
    issueObj = JSON.parse(issueObj);
    var fileName = issueObj.fileAttachment.name;
    var fileType = issueObj.fileAttachment.type;*/
    var gr = new GlideRecord('sample_table');
    gr.newRecord();
    gr.name = issueObj.name;
    gr.address = issueObj.address;
    insertRef = gr.insert();
    var ga = new GlideSysAttachment();
    ga.write(gr, fileAttachment.name, fileAttachment.type, fileAttachment);
}
The record gets created, but the attachment is corrupt.
I've hit a wall here and don't know how to proceed. Any help in this regard is highly appreciated!
Thanks,
Raskill
We can't send files directly the way we can in PHP etc.
In ServiceNow you can use the OOB Ticket Attachments widget (widget-ticket-attachments).
The alternative is to convert the file to Base64, send that data to the server script, and use it there.
Here is the code that I used in one place:
<label class="file-upload btn btn-primary">
    ${Browse for file} ...
    <input type="file" id="fileToUpload" onchange="angular.element(this).scope().setFiles(this)"/>
</label>
Client:
$scope.setFiles = function(element) {
    $scope.resumefiles = [];
    $scope.$apply(function() {
        // Turn the FileList object into an Array
        for (var i = 0; i < element.files.length; i++) {
            $scope.resumefiles.push(element.files[i]);
        }
        $scope.uploadResume($scope.resumefiles);
    });
};

$scope.uploadResume = function(resumefiles){
    var reader = new FileReader(), base64String = "";
    reader.onloadend = function () {
        base64String = reader.result.substr(reader.result.indexOf("base64,"),
            reader.result.length-1).replace("base64,","");
        $scope.data.fileData = base64String;
        $scope.data.fileName = resumefiles[0].name;
        $scope.data.fileType = resumefiles[0].type;
        $scope.data.funcName = 'updateApplicantResume';
        c.server.update().then(function(){
            $scope.data.funcName = '';
            $scope.resumefiles = [];
        })
    }
    reader.readAsDataURL(resumefiles[0]);
}
Server script:
uploadAttachment: function(input) {
    var attachment_sys_id = '';
    var attachment = new GlideSysAttachment();
    var gr = new GlideRecordSecure(input.table);
    if (gr.get(input.applicantSysId)) {
        attachment_sys_id = attachment.writeBase64(gr, input.fileName, input.fileType, input.fileData);
    }
    return attachment_sys_id;
},
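Applied back to the GlideAjax setup from the question, the same Base64 idea might look roughly like this on the client. This is only a sketch: createIssue, issueObj and the first two sysparm names come from the question, while the Script Include name and the extra sysparm parameter names are placeholders you would rename to match your own code.

var file = $j_jb('#jira_attachment')[0].files[0];
var reader = new FileReader();
reader.onloadend = function () {
    // Strip the "data:<type>;base64," prefix so only the Base64 payload is sent.
    var base64String = reader.result.split('base64,')[1];
    var ga = new GlideAjax('JiraIssueAjax'); // placeholder Script Include name
    ga.addParam('sysparm_name', 'createIssue');
    ga.addParam('sysparm_issueObj', JSON.stringify(issueObj));
    ga.addParam('sysparm_file_name', file.name);
    ga.addParam('sysparm_content_type', file.type);
    ga.addParam('sysparm_file_data', base64String);
    ga.getXML(function (response) {
        var responseStatus = response.responseXML.documentElement.getAttribute("answer");
        // handle success / failure as in the question
    });
};
reader.readAsDataURL(file);

On the server side the corresponding parameters would then go into writeBase64, as in the uploadAttachment function above, instead of GlideSysAttachment.write.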

Nodejs non-blocking and Event Emitter. Refresh URL

First of all, I'm not sure if the following is a non-blocking problem.
I'm getting started with https://github.com/sahat/hackathon-starter
Currently I try to read all files out of a folder and later process all of them.
I used EventEmitter to kind of manage the workflow.
I want to clear all arrays whenever the URL is refreshed or loaded again, but somehow when I reload the URL there seems to be something left inside the arrays, which causes multiple outputs of the same data.
At the moment I would just be happy to have a correct console.log output.
/**
 * GET /
 * Home page.
 */
var fs = require('fs');
//XML
var jsxml = require("node-jsxml");
var Namespace = jsxml.Namespace,
    QName = jsxml.QName,
    XML = jsxml.XML,
    XMLList = jsxml.XMLList;
//EventEmitter
var EventEmitter = require('events').EventEmitter;
var dateinamenEE = new EventEmitter();
var dateiinhaltEE = new EventEmitter();
var dateinamen = [];
var dateiinhalt = [];
exports.index = function(req, res) {
    fs.readdir('./data', function (err, files) {
        if (!err) {
            files.forEach(function(value) {
                dateinamen.push(value);
            });
            dateinamenEE.emit('dateinamen_ready');
        } else {
            throw err;
        }
    });
    dateinamenEE.on('dateinamen_ready', function(){
        dateinamen.forEach(function(value) {
            var buf = fs.readFileSync('./data/'+value, "utf8");
            var xml = new XML(buf);
            var list = xml.descendants("suggestion");
            var ergebnis = "";
            var basiswort = "";
            var buchstabe = "";
            var obj = null;
            list.each(function(item, index){
                ergebnis = item.attribute('data').toString()
                //basiswort = value.replace("%2B", " ");
                //basiswort = basiswort.replace(".xml", "");
                //var pieces = buchstabe.split(" ");
                obj = {k: basiswort, b: buchstabe, e: ergebnis};
                dateiinhalt.push(obj);
            });
        });
        dateiinhaltEE.emit('dateiinhalt_ready');
    });
    dateiinhaltEE.on('dateiinhalt_ready', function(){
        //console.log(dateiinhalt);
        console.log("dateinamen:" + dateinamen.length);
        console.log("dateiinhalt:" + dateiinhalt.length);
    });
    res.render('home', {
        title: 'Home'
    });
};
If I log the length of the two arrays, the output looks like this. First time loading the URL:
Express server listening on port 3000 in development mode
dateinamen:2
dateiinhalt:20
Second time / refreshing the url:
GET / 200 898.198 ms - -
GET /fonts/fontawesome-webfont.woff2?v=4.3.0 304 12.991 ms - -
GET /favicon.ico 200 4.516 ms - -
dateinamen:4
dateiinhalt:60
dateinamen:4
dateiinhalt:60
dateinamen:4
dateiinhalt:100
dateinamen:4
dateiinhalt:100
GET / 200 139.259 ms - -
What causes the code to extend the arrays while reloading the page?
The non-blocking problem is due to your for(...) loops.
Change them to: array.forEach(function(elem, index){});
EDIT
The arrays should be initialized inside the index function, because module-level variables live for the whole lifetime of the Node process and therefore keep growing across requests:
exports.index = function(req, res) {
    var dateinamen = [];
    var dateiinhalt = [];
    ...
Also, I'm not sure you need the use of EventEmitter.
Something like
fs.readdir('./data', function (err, files) {
    if (!err) {
        files.forEach(function(file){
            var buf = fs.readFileSync('./data/'+file, "utf8");
            var xml = new XML(buf);
            var list = xml.descendants("suggestion");
            var ergebnis = null;
            var obj = null;
            list.each(function(item, index){
                ergebnis = item.attribute('data').toString();
                obj = {k: file, v: ergebnis};
                dateiinhalt.push(obj);
            });
        });
        console.log(dateiinhalt);
    } else {
        throw err;
    }
});
could do the job, no?
(I wanted to say this as a comment, but I'm still missing reputation)
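If the non-blocking part of the question also matters, note that readFileSync blocks the event loop on every request. A promise-based sketch of the same processing is shown below; it assumes a Node version that ships fs.promises (newer than what hackathon-starter used back then) and keeps the node-jsxml usage from the question, with minimal error handling.

var fsp = require('fs').promises;
var XML = require('node-jsxml').XML;

exports.index = function(req, res, next) {
    var dateiinhalt = []; // local to the request, so a reload starts from an empty array
    fsp.readdir('./data')
        .then(function (files) {
            // Read and parse all files in parallel without blocking the event loop.
            return Promise.all(files.map(function (file) {
                return fsp.readFile('./data/' + file, 'utf8').then(function (buf) {
                    new XML(buf).descendants('suggestion').each(function (item) {
                        dateiinhalt.push({ k: file, e: item.attribute('data').toString() });
                    });
                });
            }));
        })
        .then(function () {
            console.log('dateiinhalt:' + dateiinhalt.length);
            res.render('home', { title: 'Home' });
        })
        .catch(next);
};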

Is there a way of creating a lnk file using javascript

I would like to give the users of my website the ability to download a "lnk" file.
My idea is to generate this file so that it contains an address that can be used only once.
Is there a way to generate this file in javascript?
The flow is something like -
the user presses a button
the javascript generates this file and downloads it to the user's machine
the user sends this file to another user to use this one-time-address from his machine
Is something like this doable in JavaScript from the client side, or would I need to generate this file using Java on the server side?
This is a faithful translation of mslink.sh.
I only tested my answer in Windows 8.1, but I would think that it works in older versions of Windows, too.
function create_lnk_blob(lnk_target) {
    function hex_to_arr(s) {
        var result = Array(s.length / 2);
        for (var i = 0; i < result.length; ++i) {
            result[i] = +('0x' + s.substr(2*i, 2));
        }
        return result;
    }
    function str_to_arr(s) {
        var result = Array(s.length);
        for (var i = 0; i < s.length; ++i) {
            var c = s.charCodeAt(i);
            if (c >= 128) {
                throw Error("Only ASCII paths are supported :-(");
            }
            result[i] = c;
        }
        return result;
    }
    function convert_CLSID_to_DATA(s) {
        var idx = [[6,2], [4,2], [2,2], [0,2],
                   [11,2], [9,2], [16,2], [14,2],
                   [19,4], [24,12]];
        var s = idx.map(function (ii) {
            return s.substr(ii[0], ii[1]);
        });
        return hex_to_arr(s.join(''));
    }
    function gen_IDLIST(s) {
        var item_size = (0x10000 + s.length + 2).toString(16).substr(1);
        return hex_to_arr(item_size.replace(/(..)(..)/, '$2$1')).concat(s);
    }
    var HeaderSize = [0x4c, 0x00,0x00,0x00],
        LinkCLSID = convert_CLSID_to_DATA("00021401-0000-0000-c000-000000000046"),
        LinkFlags = [0x01,0x01,0x00,0x00], // HasLinkTargetIDList ForceNoLinkInfo
        FileAttributes_Directory = [0x10,0x00,0x00,0x00],
        FileAttributes_File = [0x20,0x00,0x00,0x00],
        CreationTime = [0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00],
        AccessTime = [0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00],
        WriteTime = [0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00],
        FileSize = [0x00,0x00,0x00,0x00],
        IconIndex = [0x00,0x00,0x00,0x00],
        ShowCommand = [0x01,0x00,0x00,0x00], //SW_SHOWNORMAL
        Hotkey = [0x00,0x00], // No Hotkey
        Reserved = [0x00,0x00],
        Reserved2 = [0x00,0x00,0x00,0x00],
        Reserved3 = [0x00,0x00,0x00,0x00],
        TerminalID = [0x00,0x00],
        CLSID_Computer = convert_CLSID_to_DATA("20d04fe0-3aea-1069-a2d8-08002b30309d"),
        CLSID_Network = convert_CLSID_to_DATA("208d2c60-3aea-1069-a2d7-08002b30309d"),
        PREFIX_LOCAL_ROOT = [0x2f],
        PREFIX_FOLDER = [0x31,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00],
        PREFIX_FILE = [0x32,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00],
        PREFIX_NETWORK_ROOT = [0xc3,0x01,0x81],
        PREFIX_NETWORK_PRINTER = [0xc3,0x02,0xc1],
        END_OF_STRING = [0x00];
    if (/.*\\+$/.test(lnk_target)) {
        lnk_target = lnk_target.replace(/\\+$/g, '');
        var target_is_folder = true;
    }
    var prefix_root, item_data, target_root, target_leaf;
    if (lnk_target.substr(0, 2) === '\\\\') {
        prefix_root = PREFIX_NETWORK_ROOT;
        item_data = [0x1f, 0x58].concat(CLSID_Network);
        target_root = lnk_target.substr(lnk_target.lastIndexOf('\\'));
        if (/\\\\.*\\.*/.test(lnk_target)) {
            target_leaf = lnk_target.substr(lnk_target.lastIndexOf('\\') + 1);
        }
        if (target_root === '\\') {
            target_root = lnk_target;
        }
    } else {
        prefix_root = PREFIX_LOCAL_ROOT;
        item_data = [0x1f, 0x50].concat(CLSID_Computer);
        target_root = lnk_target.replace(/\\.*$/, '\\');
        if (/.*\\.*/.test(lnk_target)) {
            target_leaf = lnk_target.replace(/^.*?\\/, '');
        }
    }
    var prefix_of_target, file_attributes;
    if (!target_is_folder) {
        prefix_of_target = PREFIX_FILE;
        file_attributes = FileAttributes_File;
    } else {
        prefix_of_target = PREFIX_FOLDER;
        file_attributes = FileAttributes_Directory;
    }
    target_root = str_to_arr(target_root);
    for (var i = 1; i <= 21; ++i) {
        target_root.push(0);
    }
    var id_list_items = gen_IDLIST(item_data);
    id_list_items = id_list_items.concat(
        gen_IDLIST(prefix_root.concat(target_root, END_OF_STRING)));
    if (target_leaf) {
        target_leaf = str_to_arr(target_leaf);
        id_list_items = id_list_items.concat(
            gen_IDLIST(prefix_of_target.concat(target_leaf, END_OF_STRING)));
    }
    var id_list = gen_IDLIST(id_list_items);
    var data = [].concat(HeaderSize,
                         LinkCLSID,
                         LinkFlags,
                         file_attributes,
                         CreationTime,
                         AccessTime,
                         WriteTime,
                         FileSize,
                         IconIndex,
                         ShowCommand,
                         Hotkey,
                         Reserved,
                         Reserved2,
                         Reserved3,
                         id_list,
                         TerminalID);
    return new Blob([new Uint8Array(data)], { type: 'application/x-ms-shortcut' });
}
var blob = create_lnk_blob('C:\\Windows\\System32\\Calc.exe');
Use it like:
var blob_to_file = create_lnk_blob('C:\\Windows\\System32\\Calc.exe');
var blob_to_folder = create_lnk_blob('C:\\Users\\Myself\\Desktop\\'); // with a trailing slash
Demo: http://jsfiddle.net/5cjgLyan/2/
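To actually hand the generated blob to the user as a download in the browser, a standard object-URL anchor is enough. A small sketch (the file name is arbitrary):

function download_lnk(lnk_target, fileName) {
    var blob = create_lnk_blob(lnk_target);
    var url = URL.createObjectURL(blob);
    var a = document.createElement('a');
    a.href = url;
    a.download = fileName || 'shortcut.lnk'; // suggested name for the save dialog
    document.body.appendChild(a);
    a.click();
    a.remove();
    URL.revokeObjectURL(url);
}

download_lnk('C:\\Windows\\System32\\Calc.exe', 'calc.lnk');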
This would be simple if your website allows PHP.
If your script is part of an HTML file, just write the JavaScript as if you were writing it to send a static lnk file. Then, at the lnk-address part, break the JavaScript into two parts, dropping back into HTML, and at that point put in
<?php /* PHP code to set a variable */ /* PHP code to generate the proper string */ print /* PHP variable */
?>
I think making this purely client-side is impossible.
Even the WebRTC protocol needs at least one ICE server to signal the other client.
And I think the easiest way to do that is to use http://peerjs.com/
You could first create a client token for the room owner:
//room owner side
peer.on('open', function(my_peer_id) {
    console.log('My peer ID is: ' + my_peer_id);
});
Then send the token to anyone you want (by text file, web chat, etc.).
The other user then connects using the token above:
//the other one
var conn = peer.connect(other_peer_id);
After the room owner detects that someone has entered the room,
disconnect from the signalling server so that the token becomes unusable:
//room owner side
peer.disconnect()
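Put together, the room-owner side might look roughly like this. This is only a sketch; depending on the PeerJS version you may need to pass an API key or your own PeerServer options to new Peer().

//room owner side
var peer = new Peer(); // connects to the PeerJS signalling server and gets an id
peer.on('open', function (my_peer_id) {
    console.log('Give this one-time token to the other user: ' + my_peer_id);
});
peer.on('connection', function (conn) {
    conn.on('data', function (data) {
        console.log('Received:', data);
    });
    // Someone has used the token, so leave the signalling server to make it unusable.
    peer.disconnect();
});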
About generating and reading files on the client side, I recommend the articles below:
http://www.html5rocks.com/en/tutorials/file/dndfiles/ (reading from files)
How to use FileSaver.js to save as a file
I believe the compatibility of the FileReader API and Blob doesn't matter, since there will never be a browser that supports WebRTC but not the FileReader API.
