Gmail API - Parse message content (Base64 decoding?) with JavaScript

I'm trying to use the Gmail API to get a user's email, grab the message subject and body, and then display it on a webpage. I'll be doing other stuff with it, but this is the part that I am having difficulty with. I am using Angular.js.
Here is my API call:
function makeApiCall() {
  gapi.client.load('gmail', 'v1', function() {
    var request = gapi.client.gmail.users.messages.list({
      labelIds: ['INBOX']
    });
    request.execute(function(resp) {
      var content = document.getElementById("message-list");
      angular.forEach(resp, function(message) {
        var email = gapi.client.gmail.users.messages.get({'id': message.id});
        // var raw = email.payload.parts;
        // console.log(raw);
        content.innerHTML += JSON.stringify(email) + "<br>";
      });
    });
  });
}
So gapi.client.gmail.users.messages.list returns an array of my messages, with their ID numbers. That is working.
The call to gapi.client.gmail.users.messages.get({<specific message ID>}) outputs this - {"B":{"method":"gmail.users.messages.get","rpcParams":{},"transport":{"name":"googleapis"}}}.
Not sure what that is, but trying to get the message payload (email.payload.parts) results in undefined. So, how can I get the message content?
Also, I would assume that if I can get the message contents, I would then have to Base64-decode them to get some English out of them. Any suggestions for that would be of great help too. I've found this: https://github.com/kvz/phpjs, but since I'm not sure how to get the message contents in the first place, I'm not sure whether that php.js is of any help in that regard.

Regarding the Base64 decoding, you can use
atob(dataToDecode)
For Gmail, you'll also want to replace some characters:
atob( dataToDecode.replace(/-/g, '+').replace(/_/g, '/') );
The atob function is natively available to you in JavaScript, so there is no need to install extra stuff; I use it myself to decode Gmail messages. As an interesting tangent, if you want to encode your message to Base64, use btoa.
Now, for accessing your message payload, you can write a function:
var extractField = function(json, fieldName) {
  return json.payload.headers.filter(function(header) {
    return header.name === fieldName;
  })[0].value;
};
var date = extractField(response, "Date");
var subject = extractField(response, "Subject");
(referenced from a previous SO question of mine), and for the message body:
var part = message.parts.filter(function(part) {
  return part.mimeType == 'text/html';
})[0]; // filter returns an array, so take the first matching part
var html = atob(part.body.data.replace(/-/g, '+').replace(/_/g, '/'));

Depending on what your emails look like (a single text/plain part? multipart with text/html? attachments?), you may or may not have any "parts" in your email.payload; for single-part messages you'll instead find what you're looking for in email.payload.body.data. This all assumes you're doing a messages.get with the default format ("full"). If you'd rather receive the entire raw email in the message.raw field and handle it with the email libraries for your language, call messages.get with format=raw.
For more info check out the "body" and "parts[]" field documentation for "Message" at https://developers.google.com/gmail/api/v1/reference/users/messages
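Putting the pieces together, here is a minimal sketch that handles both shapes; getPlainBody and decodeB64Url are made-up helper names, and this assumes a message fetched with the default format "full":
// Minimal sketch, assuming format "full". Handles both multipart
// messages (payload.parts) and single-part ones (payload.body.data).
function getPlainBody(payload) { // hypothetical helper
  if (payload.parts) {
    var part = payload.parts.filter(function(p) {
      return p.mimeType === 'text/plain';
    })[0];
    if (part) return decodeB64Url(part.body.data);
  }
  if (payload.body && payload.body.data) {
    return decodeB64Url(payload.body.data);
  }
  return null;
}

function decodeB64Url(data) {
  // Gmail returns URL-safe Base64; map it back before calling atob
  return atob(data.replace(/-/g, '+').replace(/_/g, '/'));
}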

Ah! I figured it out. parts is an array, so I should have been calling it like: gapi.client.gmail.users.messages.get({'id': <message ID>}).payload.parts[0].body.data
Now my problem is decoding the emails, which works for plain-text emails but fails for emails from non-personal senders (businesses, social-media update emails, etc.). But I'll open a new question to get answers for that.

You need to search for where the body of a given MIME type is; I have written a recursive function for that:
function searchBodyRec(payload, mimeType) {
  if (payload.body && payload.body.size && payload.mimeType === mimeType) {
    return payload.body.data;
  } else if (payload.parts && payload.parts.length) {
    return payload.parts.flatMap(function(part) {
      return searchBodyRec(part, mimeType);
    }).filter(function(body) {
      return body;
    });
  }
}
So now you can call
var encodedBody = searchBodyRec(this.message.payload, 'text/plain');
See the flatMap method up there? It's a classic FP method that was long missing from JavaScript (it became native in ES2019). Here is how to add it (or you can use lodash.js or underscore.js if you don't want to mess with the native objects):
Array.prototype.flatMap = function(lambda) {
  return Array.prototype.concat.apply([], this.map(lambda));
};

Convert Facebook messenger attachment image-type to base64 in node js

I need to write some code that takes an image URL from Facebook Messenger and converts it into an image.
However, using image-to-base64 or fetch-base64 will not work, because the Facebook payload URL has a timestamp at the end of it, while those tools require the URL to end in .jpg or .png.
Url format:
https://scontent.fhan2-4.fna.fbcdn.net/v/t1.15752-0/p480x480/60251115_627131164420267_474161086648549376_n.png?_nc_cat=100&_nc_oc=AQmldFK_xUgJPT-rqrk4bxLivk8ispusU5THY7br4ZpvNTfcYVrfU-rBFlIX9cwUzaw&_nc_ht=scontent.fhan2-4.fna&oh=140ea3424f8fa6a9085b3ae88281fa51&oe=5D5F4DAD
I have tried using image-to-base64 and fetch-base64, and neither of them works.
const image2base64 = require('image-to-base64');

module.exports = (string) => {
  console.log(string);
  image2base64(string) // pass the image URL here
    .then((response) => {
      console.log(response);
    })
    .catch((error) => {
      console.log(error);
    });
};
I would like to get the base64 result, so I wonder if there is any package that can help me with this task.
I think I got the problem.
For anyone who needs to get the URL of an attachment in Facebook Messenger: the value of
message.attachments[0].payload
when represented as a String looks like this:
{"url":"THE_URL_THAT_YOU_NEED"}
So, to get THE_URL_THAT_YOU_NEED, you have to remove the leading {"url":" and the trailing "} first, which can be done easily like this:
var url= (JSON.stringify(event.message.attachments[0].payload).replace("{\"url\":\"","")).replace("\"}","");
Probably not the best solution, but works for me.
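Note that the string form above shows payload is just an object with a url property, so in a webhook handler you can usually read it directly instead of doing string surgery. A sketch, assuming Node.js (the query string on the URL is irrelevant once you download the bytes yourself):
var https = require('https');

// payload is already an object, so no stringify-and-replace is needed:
var url = event.message.attachments[0].payload.url;

// Download the image and base64-encode the raw bytes. The trailing
// query parameters don't matter here. (Sketch: no redirect handling.)
function imageUrlToBase64(url, callback) {
  https.get(url, function (res) {
    var chunks = [];
    res.on('data', function (chunk) { chunks.push(chunk); });
    res.on('end', function () {
      callback(null, Buffer.concat(chunks).toString('base64'));
    });
  }).on('error', callback);
}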

getting crypto.subtle.encrypt to work like CryptoJS.AES.encrypt

I wrote a system that is implemented using CryptoJS.
After writing my code, I discovered crypto.subtle.encrypt which is an AES implementation built into browsers.
I want to change my code away from using CryptoJs and onto using crypto.subtle.encrypt.
Data encoded the old way (CryptoJS) has to be compatible with the new way (crypto.subtle.encrypt).
How can I achieve this?
When I wrote my original code, it looked much like this:
function cryptojs_encrypt(message) {
  var key = "my password";
  return CryptoJS.AES.encrypt(message, key).toString();
}
Where the "key" passed in is just a string. From what I've been able to read from other stackoverflow questions, CryptoJS converts this string into a "key" and "iv". How exactly is this achieved? I tried looking through the CryptoJS source code but couldn't find what I was looking for.
The way crypto.subtle.encrypt works is that you have to pass in the key and IV explicitly. Here is my code:
function subtle_encrypt(message) {
  var msg = new TextEncoder().encode(message);
  var pass = new TextEncoder().encode('my password');
  // note: raw AES keys must be 16, 24 or 32 bytes, and the CBC IV 16 bytes
  var alg = { name: 'AES-CBC', iv: pass };
  crypto.subtle.importKey('raw', pass, alg, false, ['encrypt']).then(function(key) {
    crypto.subtle.encrypt(alg, key, msg).then(function(ctBuffer) {
      // btoa expects a string, so convert the ArrayBuffer first
      var string = btoa(String.fromCharCode.apply(null, new Uint8Array(ctBuffer)));
      console.log("result", string);
    });
  });
}
This works but returns a different result. I need to modify the arguments that go into alg so that they match what CryptoJS uses when you pass in a string. How do I do this?
I've created a small library to do just that.
Embed WebCrypto.js (Minified) in your document.
Use it like this:
// Initialize the library
initWebCrypto();

// Encrypt your stuff
WebCrypto.encrypt({
  data: btoa("my message"),
  password: "my password",
  callback: function(response) {
    if (!response.error) {
      console.log(response.result); // Compatible with CryptoJS
    } else {
      console.error(response.error);
    }
  }
});
See https://github.com/etienne-martin/WebCrypto.swift/blob/master/www/index.html for more examples.
Source code: https://github.com/etienne-martin/WebCrypto.swift/blob/master/source.js
Hope this helps!
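To answer the "how is the key derived" part of the question: when CryptoJS.AES.encrypt receives a string password, it uses OpenSSL's EVP_BytesToKey scheme (MD5-based, one iteration, with a random 8-byte salt), and .toString() yields the Base64 of "Salted__" + salt + ciphertext. A minimal sketch of that derivation, assuming an md5(Uint8Array) helper is available (WebCrypto itself offers no MD5):
// Sketch of OpenSSL's EVP_BytesToKey as CryptoJS applies it.
// For CryptoJS's AES default, keyLen is 32 and ivLen is 16.
function evpBytesToKey(password, salt, keyLen, ivLen) {
  var pass = new TextEncoder().encode(password);
  var derived = new Uint8Array(0);
  var block = new Uint8Array(0);
  while (derived.length < keyLen + ivLen) {
    var input = new Uint8Array(block.length + pass.length + salt.length);
    input.set(block, 0);
    input.set(pass, block.length);
    input.set(salt, block.length + pass.length);
    block = md5(input); // assumed md5(Uint8Array) -> Uint8Array helper
    var next = new Uint8Array(derived.length + block.length);
    next.set(derived, 0);
    next.set(block, derived.length);
    derived = next;
  }
  return {
    key: derived.slice(0, keyLen),
    iv: derived.slice(keyLen, keyLen + ivLen)
  };
}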

Transferring strings containing emojis to server

On my website, whenever a user enters a mobile emoji like 😀 into an input field, it is saved as ?? in my database.
Those emojis are encoded in utf8mb4, so I already updated my database collation to utf8mb4_general_ci.
While the emoticons can now be saved successfully, when transferring a message containing an emoji from a client to my server it still gets changed into ?? somewhere, and I am trying to figure out where that happens and how to solve it.
Sending the message to my server happens in this ajax call:
function updateStatus() {
  var status = $("#status").val();
  jsRoutes.controllers.Userdata.updateStatus(status).ajax({
    success: function(data) {
      $("#profil").cftoaster({content: data});
    },
    error: function(err) {
      $("#profil").cftoaster({content: err.responseText});
    }
  });
}
On the server side I am using the Java-based Play Framework 2.4.4.
This is the beginning of the method which is called in the ajax call:
public static Result updateStatus(String status) {
  String receivedName = Application.getSessionUser();
  Logger.debug("RecvStatus: " + status);
  ...
}
The Logger output is already ?? for an emoticon.
The route looks like this:
PUT /status/ controllers.Userdata.updateStatus(status: String)
EDIT:
To make sure the transfer from client to server is alright, I am now transferring the actual unicode values. I changed my server function like this:
Logger.debug("RecvStatus: " + status);
status = status.replace("\\", "");
String[] arr = status.split("u");
status = "";
for (int i = 1; i < arr.length; i++) {
  int hexVal = Integer.parseInt(arr[i], 16);
  status += (char) hexVal;
}
Logger.debug("RecvStatus: " + status);
and get the following output:
[debug] application - RecvStatus: \ud83d\ude01
[debug] application - RecvStatus: ?
which means the problem is probably on the Java side.
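For completeness, a sketch of the client-side escaping implied above (toUnicodeEscapes is a made-up helper; it was not shown in the original code):
// Turn every non-ASCII UTF-16 code unit into a \uXXXX escape
// before sending, so only ASCII travels over the wire.
function toUnicodeEscapes(str) {
  return Array.prototype.map.call(str, function (ch) {
    var code = ch.charCodeAt(0);
    return code < 128 ? ch : '\\u' + ('000' + code.toString(16)).slice(-4);
  }).join('');
}
console.log(toUnicodeEscapes('😁')); // "\ud83d\ude01"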
So for anyone having a similar problem, here is my workaround.
First I tried to convert the String in Java to Base64 and store it that way. Unfortunately, after decoding the Base64, the unicode information was still lost and ?? was shown instead of emoticons.
What I then did was convert the received String into unicode escapes and then into Base64; when I load the Strings from the database, I first decode the Base64 and then convert the unicode information back into an actual String. This way the emoticons are stored and afterwards shown correctly.

How to parse JSON using Node.js? [closed]

How should I parse JSON using Node.js? Is there some module which will validate and parse JSON securely?
You can simply use JSON.parse.
The definition of the JSON object is part of the ECMAScript 5 specification. Node.js is built on Google Chrome's V8 engine, which adheres to the ECMA standard. Therefore, Node.js also has a global JSON object.
Note: JSON.parse can tie up the current thread because it is a synchronous method. So if you are planning to parse big JSON objects, use a streaming JSON parser.
You can also require .json files.
var parsedJSON = require('./file-name');
For example if you have a config.json file in the same directory as your source code file you would use:
var config = require('./config.json');
or (file extension can be omitted):
var config = require('./config');
Note that require is synchronous and only reads the file once; subsequent calls return the result from cache.
Also note that you should only use this for local files under your absolute control, as it potentially executes any code within the file.
You can use JSON.parse().
You should be able to use the JSON object on any ECMAScript 5 compatible JavaScript implementation, and V8, upon which Node.js is built, is one of them.
Note: If you're using a JSON file to store sensitive information (e.g. passwords), that's the wrong way to do it. See how Heroku does it: https://devcenter.heroku.com/articles/config-vars#setting-up-config-vars-for-a-deployed-application. Find out how your platform does it, and use process.env to retrieve the config vars from within the code.
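For example, a minimal sketch of reading such a value (DB_PASSWORD is a made-up variable name):
// Read a secret from the environment instead of a JSON file
var dbPassword = process.env.DB_PASSWORD;
if (!dbPassword) throw new Error('DB_PASSWORD is not set');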
Parsing a string containing JSON data
var str = '{ "name": "John Doe", "age": 42 }';
var obj = JSON.parse(str);
Parsing a file containing JSON data
You'll have to do some file operations with fs module.
Asynchronous version
var fs = require('fs');
fs.readFile('/path/to/file.json', 'utf8', function (err, data) {
if (err) throw err; // we'll not consider error handling for now
var obj = JSON.parse(data);
});
Synchronous version
var fs = require('fs');
var json = JSON.parse(fs.readFileSync('/path/to/file.json', 'utf8'));
You wanna use require? Think again!
You can sometimes use require:
var obj = require('path/to/file.json');
But, I do not recommend this for several reasons:
require is synchronous. If you have a very big JSON file, it will choke your event loop. You really need to use JSON.parse with fs.readFile.
require will read the file only once. Subsequent calls to require for the same file will return a cached copy. Not a good idea if you want to read a .json file that is continuously updated. You could use a hack (see the sketch after this list). But at this point, it's easier to simply use fs.
If your file does not have a .json extension, require will not treat the contents of the file as JSON.
Seriously! Use JSON.parse.
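For reference, the cache-busting hack mentioned above is deleting the entry from require.cache before requiring again; a sketch, not a recommendation:
// Force require to re-read the file on every call
function requireUncached(modulePath) {
  delete require.cache[require.resolve(modulePath)];
  return require(modulePath);
}
var fresh = requireUncached('./config.json'); // still synchronous!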
load-json-file module
If you are reading a large number of .json files (and if you are extremely lazy), it becomes annoying to write the boilerplate code every time. You can save some characters by using the load-json-file module.
const loadJsonFile = require('load-json-file');
Asynchronous version
loadJsonFile('/path/to/file.json').then(json => {
// `json` contains the parsed object
});
Synchronous version
let obj = loadJsonFile.sync('/path/to/file.json');
Parsing JSON from streams
If the JSON content is streamed over the network, you need to use a streaming JSON parser. Otherwise it will tie up your processor and choke your event loop until JSON content is fully streamed.
There are plenty of packages available in NPM for this. Choose what's best for you.
Error Handling/Security
If you are unsure whether what is passed to JSON.parse() is valid JSON, make sure to enclose the call to JSON.parse() inside a try/catch block. A user-provided JSON string could crash your application, and could even lead to security holes. Make sure error handling is done whenever you parse externally provided JSON.
Use the JSON object:
JSON.parse(str);
Another example of JSON.parse:
var fs = require('fs');
var file = __dirname + '/config.json';

fs.readFile(file, 'utf8', function (err, data) {
  if (err) {
    console.log('Error: ' + err);
    return;
  }
  data = JSON.parse(data);
  console.dir(data);
});
I'd like to mention that there are alternatives to the global JSON object.
JSON.parse and JSON.stringify are both synchronous, so if you want to deal with big objects you might want to check out some of the asynchronous JSON modules.
Have a look: https://github.com/joyent/node/wiki/Modules#wiki-parsers-json
Include the built-in fs library:
var fs = require("fs");
var file = JSON.parse(fs.readFileSync("./PATH/data.json", "utf8"));
For more info on the fs library, refer to the documentation at http://nodejs.org/api/fs.html
Since you don't know whether your string is actually valid, I would put it first into a try/catch. Also, since try/catch blocks were historically not optimized by Node, I would put the entire thing into another function:
function tryParseJson(str) {
  try {
    return JSON.parse(str);
  } catch (ex) {
    return null;
  }
}
OR in "async style"
function tryParseJson(str, callback) {
  process.nextTick(function () {
    try {
      callback(null, JSON.parse(str));
    } catch (ex) {
      callback(ex);
    }
  });
}
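Usage of the async-style version might look like this:
tryParseJson('{"ok": true}', function (err, obj) {
  if (err) return console.error('invalid JSON');
  console.log(obj.ok); // true
});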
Parsing a JSON stream? Use JSONStream.
var request = require('request')
  , JSONStream = require('JSONStream')
  , es = require('event-stream'); // event-stream is needed for es.mapSync

request({url: 'http://isaacs.couchone.com/registry/_all_docs'})
  .pipe(JSONStream.parse('rows.*'))
  .pipe(es.mapSync(function (data) {
    return data;
  }));
https://github.com/dominictarr/JSONStream
Everybody here has talked about JSON.parse, so I thought of saying something else. There is a great module, Connect, with many middlewares to make development of apps easier and better. One of the middlewares is bodyParser, which parses JSON, HTML forms, etc. There is also a specific middleware for JSON parsing only.
Take a look; it might be really helpful to you.
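A minimal sketch of wiring that up, assuming the classic Connect 2.x API where connect.json() ships as bundled middleware:
var connect = require('connect');

var app = connect()
  .use(connect.json()) // parses JSON request bodies onto req.body
  .use(function (req, res) {
    res.end('Got: ' + JSON.stringify(req.body));
  });

app.listen(3000);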
JSON.parse("your string");
That's all.
As other answers here have mentioned, you probably want either to require a local JSON file that you know is safe and present, like a configuration file:
var objectFromRequire = require('path/to/my/config.json');
or to use the global JSON object to parse a string value into an object:
var stringContainingJson = '\"json that is obtained from somewhere\"';
var objectFromParse = JSON.parse(stringContainingJson);
Note that when you require a file, the content of that file is evaluated, which introduces a security risk in case it's not a JSON file but a JS file.
Here, I've published a demo where you can see both methods and play with them online (the parsing example is in the app.js file; click the run button and see the result in the terminal):
http://staging1.codefresh.io/labs/api/env/json-parse-example
You can modify the code and see the impact...
Using JSON for your configuration with Node.js? Read this and get your configuration skills over 9000...
Note: People claiming that data = require('./data.json'); is a
security risk and downvoting people's answers with zealous zeal: You're exactly and completely wrong.
Try placing non-JSON in that file... Node will give you an error, exactly like it would if you did the same thing with the much slower and harder to code manual file read and then subsequent JSON.parse(). Please stop spreading misinformation; you're hurting the world, not helping. Node was designed to allow this; it is not a security risk!
Proper applications come in 3+ layers of configuration:
Server/Container config
Application config
(optional) Tenant/Community/Organization config
User config
Most developers treat their server and app config as if it can change. It can't. You can layer changes from higher layers on top of each other, but you're modifying base requirements. Some things need to exist! Make your config act like it's immutable, because some of it basically is, just like your source code.
Failing to see that lots of your stuff isn't going to change after startup leads to anti-patterns like littering your config loading with try/catch blocks, and pretending you can continue without a properly set up application. You can't. If you can, that belongs in the community/user config layer, not the server/app config layer. You're just doing it wrong. The optional stuff should be layered on top when the application finishes its bootstrap.
Stop banging your head against the wall: Your config should be ultra simple.
Take a look at how easy it is to setup something as complex as a protocol-agnostic and datasource-agnostic service framework using a simple json config file and simple app.js file...
container-config.json...
{
  "service": {
    "type": "http",
    "name": "login",
    "port": 8085
  },
  "data": {
    "type": "mysql",
    "host": "localhost",
    "user": "notRoot",
    "pass": "oober1337",
    "name": "connect"
  }
}
index.js... (the engine that powers everything)
var config = require('./container-config.json'); // Get our service configuration.
var data = require(config.data.type); // Load our data source plugin ('npm install mysql' for mysql).
var service = require(config.service.type); // Load our service plugin ('http' is built-in to node).
var processor = require('./app.js'); // Load our processor (the code you write).
var connection = data.createConnection({ host: config.data.host, user: config.data.user, password: config.data.pass, database: config.data.name });
var server = service.createServer(processor);
connection.connect();
server.listen(config.service.port, function() { console.log("%s service listening on port %s", config.service.type, config.service.port); });
app.js... (the code that powers your protocol-agnostic and data-source agnostic service)
module.exports = function(request, response) {
  response.end('Responding to: ' + request.url);
};
Using this pattern, you can now load community and user config stuff on top of your booted app; dev ops is ready to shove your work into a container and scale it. You're ready for multi-tenant. Userland is isolated. You can now separate the concerns of which service protocol you're using and which database type you're using, and just focus on writing good code.
Because you're using layers, you can rely on a single source of truth for everything, at any time (the layered config object), and avoid error checks at every step, worrying about "oh crap, how am I going to make this work without proper config?!?".
If you need to parse JSON with Node.js in a secure way (aka: the user can input data, or a public API) I would suggest using secure-json-parse.
The usage is like the default JSON.parse but it will protect your code from:
prototype poisoning
and constructor abuse:
const badJson = '{ "a": 5, "b": 6, "__proto__": { "x": 7 }, "constructor": {"prototype": {"bar": "baz"} } }'
const infected = JSON.parse(badJson)
console.log(infected.x) // print undefined
const x = Object.assign({}, infected)
console.log(x.x) // print 7
const sjson = require('secure-json-parse')
console.log(sjson.parse(badJson)) // it will throw by default, you can ignore malicious data also
Just to complete the answer (as I struggled with this for a while), I want to show how to access the parsed JSON information; this example shows accessing a JSON array:
var request = require('request');

request('https://server/run?oper=get_groups_joined_by_user_id&user_id=5111298845048832', function (error, response, body) {
  if (!error && response.statusCode == 200) {
    var jsonArr = JSON.parse(body);
    console.log(jsonArr);
    console.log("group id:" + jsonArr[0].id);
  }
});
Just to make this as complicated as possible, and bring in as many packages as possible...
const fs = require('fs');
const bluebird = require('bluebird');
const _ = require('lodash');
const readTextFile = _.partial(bluebird.promisify(fs.readFile), _, {encoding:'utf8',flag:'r'});
const readJsonFile = filename => readTextFile(filename).then(JSON.parse);
This lets you do:
var dataPromise = readJsonFile("foo.json");
dataPromise.then(console.log);
Or if you're using async/await:
let data = await readJsonFile("foo.json");
The advantage over just using readFileSync is that your Node server can process other requests while the file is being read off disk.
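On newer Node versions you can get the same effect without bluebird or lodash, using the built-in util.promisify (a sketch):
const fs = require('fs');
const { promisify } = require('util'); // available since Node 8

const readFile = promisify(fs.readFile);
const readJsonFile = filename => readFile(filename, 'utf8').then(JSON.parse);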
JSON.parse will not ensure the safety of the JSON string you are parsing. You should look at a library like json-safe-parse or similar.
From json-safe-parse npm page:
JSON.parse is great, but it has one serious flaw in the context of JavaScript: it allows you to override inherited properties. This can become an issue if you are parsing JSON from an untrusted source (eg: a user), and calling functions on it you would expect to exist.
Leverage Lodash's attempt function to return an error object, which you can handle with the isError function.
// Returns an error object on failure
function parseJSON(jsonString) {
  return _.attempt(JSON.parse.bind(null, jsonString));
}

// Example usage
var goodJson = '{"id":123}';
var badJson = '{id:123}';

var goodResult = parseJSON(goodJson);
var badResult = parseJSON(badJson);

if (_.isError(goodResult)) {
  console.log('goodResult: handle error');
} else {
  console.log('goodResult: continue processing');
}
// > goodResult: continue processing

if (_.isError(badResult)) {
  console.log('badResult: handle error');
} else {
  console.log('badResult: continue processing');
}
// > badResult: handle error
Always be sure to use JSON.parse in a try/catch block, as Node throws an error if your JSON contains corrupted data; so use this code instead of a plain JSON.parse:
try {
  JSON.parse(data);
} catch (e) {
  throw new Error("data is corrupted");
}
As mentioned in the answers above, we can use JSON.parse() to parse a string into JSON.
But before parsing, be sure the data is valid, or else it might bring your whole application down.
It is safe to use it like this:
let parsedObj = {};
try {
  parsedObj = JSON.parse(data);
} catch (e) {
  console.log("Cannot parse, because the data is not in proper JSON format");
}
Use JSON.parse(str).
Here are some examples:
var jsonStr = '{"result":true, "count":42}';
obj = JSON.parse(jsonStr);
console.log(obj.count); // expected output: 42
console.log(obj.result); // expected output: true
If you want to allow comments and trailing commas in your JSON, you might want to use the implementation below:
var fs = require('fs');

var data = parseJsData('./message.json');
console.log('[INFO] data:', data);

function parseJsData(filename) {
  var json = fs.readFileSync(filename, 'utf8')
    .replace(/\s*\/\/.+/g, '')   // strip // comments
    .replace(/,(\s*\})/g, '}');  // strip trailing commas before }
  return JSON.parse(json);
}
Note that it might not work well if you have something like "abc": "foo // bar" in your JSON. So YMMV.
If the JSON source file is pretty big, you may want to consider the asynchronous route via the native async/await approach available since Node.js 8.0, as follows:
const fs = require('fs');

const fsReadFile = (fileName) => {
  fileName = `${__dirname}/${fileName}`;
  return new Promise((resolve, reject) => {
    fs.readFile(fileName, 'utf8', (error, data) => {
      if (!error && data) {
        resolve(data);
      } else {
        reject(error);
      }
    });
  });
};

async function parseJSON(fileName) {
  try {
    return JSON.parse(await fsReadFile(fileName));
  } catch (err) {
    return { Error: `Something has gone wrong: ${err}` };
  }
}

parseJSON('veryBigFile.json')
  .then(res => console.log(res))
  .catch(err => console.log(err));
I use fs-extra. I like it a lot because, although it supports callbacks, it also supports promises, so it enables me to write my code in a much more readable way:
const fs = require('fs-extra');
fs.readJson("path/to/foo.json")
  .then(obj => {
    // Do some stuff with obj
  })
  .catch(err => {
    console.error(err);
  });
It also has many useful methods which do not come along with the standard fs module and, on top of that, it also bridges the methods from the native fs module and promisifies them.
NOTE: You can still use the native Node.js methods. They are promisified and copied over to fs-extra. See notes on fs.read() & fs.write()
So it's basically all advantages. I hope others find this useful.
You can use JSON.parse(), a built-in function, though you will probably want to wrap it in try/catch statements.
Or use some JSON parsing npm library, something like json-parse-or
Use this to be on the safe side (here, arr is an array of Buffer chunks):
var data = JSON.parse(Buffer.concat(arr).toString());
Node.js is a JavaScript-based server, so you can do this the way you would in pure JavaScript...
Imagine you have this JSON in Node.js...
var details = '{ "name": "Alireza Dezfoolian", "netWorth": "$0" }';
var obj = JSON.parse(details);
And you can do the above to get a parsed version of your JSON...
No further modules need to be required.
Just use
var parsedObj = JSON.parse(yourObj);
I don't think there are any security issues regarding this.
It's simple: you can convert JSON to a string using JSON.stringify(json_obj), and convert a string to JSON using JSON.parse("your json string").

Parameter retrieval for HTTP PUT requests under IIS5.1 and ASP-classic?

I'm trying to implement a REST interface under IIS5.1/ASP-classic (XP-Pro development box). So far, I cannot find the incantation required to retrieve request content variables under the PUT HTTP method.
With a request like:
PUT http://localhost/rest/default.asp?/record/1336
Department=Sales&Name=Jonathan%20Doe%203548
how do I read Department and Name values into my ASP code?
Request.Form appears to only support POST requests. Request.ServerVariables only gets me to header information. Request.QueryString doesn't get me to the content either...
Based on the replies from AnthonyWJones and ars, I went down the BinaryRead path and came up with the first attempt below:
var byteCount = Request.TotalBytes;
var binContent = Request.BinaryRead(byteCount);

// Use an ADODB recordset to convert the byte array to a string
var rst = Server.CreateObject('ADODB.Recordset');
rst.Fields.Append('myBinary', 201, byteCount); // 201 = adLongVarChar
rst.Open();
rst.AddNew();
rst('myBinary').AppendChunk(binContent);
rst.update();
var binaryString = rst('myBinary');
var contentString = binaryString.Value;

// Split the url-encoded body into name/value pairs
var parameters = {};
var pairs = HtmlDecode(contentString).split(/&/);
for (var pair in pairs) {
  var param = pairs[pair].split(/=/);
  parameters[param[0]] = decodeURI(param[1]);
}
This blog post by David Wang, and an HtmlDecode() function taken from Andy Oakley at blogs.msdn.com, also helped a lot.
Doing this splitting and escaping by hand, I'm sure there are 1001 bugs in here, but at least I'm moving again. Thanks.
Unfortunately ASP predates the REST concept by quite some years.
If you are going RESTful then I would consider not using URL-encoded form data. Use XML instead. You will be able to accept an XML entity body with:
Dim xml : Set xml = CreateObject("MSXML2.DOMDocument.3.0")
xml.async = false
xml.Load Request
Otherwise you will need to use BinaryRead on the Request object, laboriously convert the byte array to text, and then parse the URL encoding yourself along with decoding the escape sequences.
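For the byte-array-to-text step, an ADODB.Stream can stand in for the Recordset trick shown earlier; a sketch in JScript under classic ASP:
// Convert the byte array from Request.BinaryRead into a string
function bytesToString(binContent) {
  var stream = Server.CreateObject('ADODB.Stream');
  stream.Type = 1;            // adTypeBinary
  stream.Open();
  stream.Write(binContent);
  stream.Position = 0;
  stream.Type = 2;            // adTypeText
  stream.Charset = 'us-ascii';
  return stream.ReadText();
}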
Try using the BinaryRead method in the Request object:
http://www.w3schools.com/ASP/met_binaryread.asp
Other options are to write an ASP server component or ISAPI filter:
http://www.codeproject.com/KB/asp/cookie.aspx
