JavaScript - Querying the VoltDB API with fetch

I'm trying to query a VoltDB database using its HTTP API:
const url = 'http://server:8080/api/1.0/'
const queryParam = encodeURIComponent('select * from table')
const queryURL = url + `?Procedure=@AdHoc&Parameters=['${queryParam}']&jsonp=console.log`
fetch(queryURL).then( response => {
    response.text().then( text => console.log(text) )
})
That code throws a "No Access-Control-Allow-Origin" error.
If I change the fetch call to this:
fetch(queryURL, { mode: 'no-cors' }).then( response => {
    response.text().then( text => console.log(text) )
})
It does nothing.

This is a browser security feature. If you are serving a web page from one URL, and within the page you have embedded calls to another host or port, the browser won't allow it.
One way to get around this is to add a proxy to your web server, so it can make the calls to port 8080, and pass the responses back to the web page from the same origin.
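For example, a minimal same-origin proxy using Node's built-in http module could look like the sketch below; the /voltdb path prefix, port 3000, and localhost are assumptions about your setup, not anything VoltDB requires.
// Sketch: forward /voltdb/* from the page's own origin to VoltDB's JSON API on port 8080.
const http = require('http');

http.createServer((req, res) => {
    if (req.url.startsWith('/voltdb/')) {
        const proxyReq = http.request({
            host: 'localhost',
            port: 8080,
            path: req.url.replace('/voltdb', '/api/1.0'),
            method: req.method,
            headers: req.headers
        }, proxyRes => {
            res.writeHead(proxyRes.statusCode, proxyRes.headers);
            proxyRes.pipe(res);
        });
        req.pipe(proxyReq);
    } else {
        // ...serve your web page here...
        res.writeHead(200, { 'Content-Type': 'text/html' });
        res.end('<!-- page -->');
    }
}).listen(3000);
The page would then call fetch('/voltdb/?Procedure=...') against its own origin.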
You may see some answers on Stack Overflow about using CORS to get around this error, but that requires changing the headers that VoltDB uses on port 8080, so that's not something you can do yourself, and we have no plans to do that.
Another solution is to use the voltdb.js file provided in some of our demos, such as the NBBO demo dashboard: https://github.com/VoltDB/voltdb/tree/master/examples/nbbo/web
I think this uses low-level JavaScript to open a socket and make the HTTP call without using XMLHttpRequest, so it avoids the No Access-Control-Allow-Origin error.
In the example, the code that is specific to the NBBO example is in demo.js, voltdb-dashboard.js contains code that is common to various example dashboards, and voltdb.js is the base library that provides access to call procedures asynchronously.

You should encode all of the URI parameters, not just the procedure parameters:
$ curl --data 'Procedure=@AdHoc&Parameters=["select count(*) from store;"]' http://127.0.0.1:8080/api/1.0/
{"status":1,"appstatus":-128,"statusstring":null,"appstatusstring":null,"results":[{"status":-128,"schema":[{"name":"C1","type":6}],"data":[[100000]]}]}
or
$ curl --data 'Procedure=%40AdHoc&Parameters=%5B%22select+count(*)+from+store%3B%22%5D' http://127.0.0.1:8080/api/1.0/; echo
{"status":1,"appstatus":-128,"statusstring":null,"appstatusstring":null,"results":[{"status":-128,"schema":[{"name":"C1","type":6}],"data":[[100000]]}]}

Related

Getting Console Messages on Webpage NodeJS

I'm wondering if there's any way to listen for console messages and act on them when they're received. Mainly, is there any way to do this without an external module, using the http module?
The goal is to trigger a NodeJS function or code snippet on an event like a click in the HTML. If there's a way to do this, that's great. But once again, I'd like to do this without an external module, using just those that are built into NodeJS.
Use an onclick() handler in JavaScript to trigger a function call when clicking on an element. Then use fetch to make an API call to the NodeJS server.
I know @Haris Wilson already got the answer, but I'd just like to provide a code example.
Instead of trying to catch a console message and then execute a function if we find it, we can use fetch() to make a request to whatever URL we need, and this can allow us to make other requests.
In this case, we can use the url module and the http module to parse the url and serve the API and website, respectively.
const url = require('url')
const http = require('http')

const requestListener = async function (req, res) {
    const parsedUrl = url.parse(req.url, true)

    // API
    if (parsedUrl.pathname === '/APIcall') {
        const params = parsedUrl.query
        // Perform necessary actions here with `params`
        res.writeHead(200, { 'Content-Type': 'application/json' })
        res.end(JSON.stringify({ received: params }))
        return
    }

    // Basic server setup: serve the web page
    res.writeHead(200, {
        'Content-Type': 'text/html'
    });
    res.end(/** Content here */)
}

http.createServer(requestListener).listen(8080)
We can now use onclick to call a function inside our webpage JavaScript, and use fetch([API URL]) to send data to our NodeJS server so it can perform an action. We can use URL params to do this, such as http://localhost:8080/APIcall?data=someData&moreParam=more-data, where ?data=someData&moreParam=more-data are the URL params.
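For completeness, here is a sketch of the browser side; the /APIcall route and parameter names match the server sketch above, and the button markup is only an example:
// In the page's HTML: <button onclick="sendData()">Send</button>
function sendData() {
    fetch('/APIcall?data=someData&moreParam=more-data')
        .then(res => res.json())
        .then(json => console.log('server replied:', json))
        .catch(err => console.error(err));
}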

How to pass event parameters to AWS Lambda function using API Gateway?

I have an AWS Lambda function written in Python that is initiated by a Zapier trigger that I set up. As I pass some input parameters to the function in the Zapier trigger, I can access the input parameters in my Python code through variables such as event['parameter1']. It works perfectly.
I'm trying to access the same Lambda function in the Airtable Scripting environment. In order to do that, I set up an API Gateway trigger for the Lambda function, but I can't figure out how to pass input parameters in the vanilla JS environment. Below is the code that I have, which gives me an "Internal Server Error".
Your help would be definitely appreciated!
const awsUrl = "https://random-id.execute-api.us-west-2.amazonaws.com/default/lambda-function";
let event = {
"queryStringParameters": {
"gdrive_folder_id": consFolderId,
"invitee_email": email
}
};
let response = await fetch(awsUrl, {
method: "POST",
body: JSON.stringify(event),
headers: {
"Content-Type": "application/json",
}
});
console.log(await response.json());
[Edited] Plus, here's the code of the Lambda function and the latest CloudWatch log after a successful execution invoked by Zapier. It's a simple piece of code that automates Google Drive folder sharing based on 2 inputs (folder ID + email address). Please bear with me for the poor code quality!
from __future__ import print_function
from googleapiclient.discovery import build
from google.oauth2 import service_account

SCOPES = ['https://www.googleapis.com/auth/drive']
SERVICE_ACCOUNT_FILE = 'service.json'

def lambda_handler(event, context):
    """Shows basic usage of the Drive v3 API.
    Prints the names and ids of the first 10 files the user has access to.
    """
    # 2-legged OAuth from Google service account
    creds = service_account.Credentials.from_service_account_file(
        SERVICE_ACCOUNT_FILE, scopes=SCOPES)
    drive_service = build('drive', 'v3', credentials=creds)

    # change multiple permissions with batch requests
    folder_id = event['gdrive_folder_id']
    email_address = event['invitee_email']

    def callback(request_id, response, exception):
        if exception:
            # Handle error
            print(exception)
        else:
            print("Permission Id: {}".format(response.get('id')))

    batch = drive_service.new_batch_http_request(callback=callback)
    user_permission = {
        'type': 'user',
        'role': 'writer',
        'emailAddress': email_address
    }
    batch.add(drive_service.permissions().create(
        fileId=folder_id,
        body=user_permission,
        fields='id',
    ))
    batch.execute()
I'm not a Python expert and I don't know how you've set up your API Gateway integration with Lambda, but I believe your code can have two issues:
1.) An Internal Server Error response from the API Gateway endpoint also often points to a problem in the integration between API Gateway and your Lambda function. In this case I can not see where you are returning a valid response back to API Gateway. In your example the return value of batch.execute() is probably returned, right? However, by default API Gateway expects an object that contains a statusCode and body and optionally headers. You can have a look at the AWS Lambda handler documentation for Python and their examples. This documentation page might also be of interest to you.
2.) In your function you are accessing the event data like event['gdrive_folder_id']. However, I can not see that you are parsing the event data anywhere. Are you using a custom integration between your API Gateway and Lambda? Because in the case of a proxy integration, API Gateway sends an object that has a body field, and from there you'd need to read the HTTP body. See the examples on this documentation page.
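For reference, with the (default) Lambda proxy integration the handler receives the HTTP body as a JSON string in event['body'] and the query string in event['queryStringParameters'], and it must return an object with statusCode and body. A minimal sketch of that shape, written as a Node.js handler here just for brevity; the Python handler needs the equivalent (json.loads(event['body']) and a dict with statusCode and body):
// Sketch of the proxy-integration request/response shape, not your actual logic:
exports.handler = async (event) => {
    // The POST body from the Airtable script arrives as a JSON string:
    const body = event.body ? JSON.parse(event.body) : {};
    const params = body.queryStringParameters || event.queryStringParameters || {};

    // ...share the Drive folder here using params.gdrive_folder_id / params.invitee_email...

    return {
        statusCode: 200,
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ ok: true })
    };
};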
Here are some more things you can check on your own:
Have you also checked what you get when you just print the event data? Also, does batch.execute() wait for the batch processing, and does it return anything? If so, what does it return?
One note here: you haven't told us anything about the integration between your API Gateway and your Lambda function. Since you can do some mapping between API Gateway and AWS Lambda, it is possible that you are converting the request and response outside of the Lambda function, and hence my suggestions above are wrong. Let me know whether this is the case and we can investigate further.

Facebook Javascript SDK: Can't load a page's public feed [duplicate]

I'm trying to use the Facebook Graph API to get the latest status from a public page, let's say http://www.facebook.com/microsoft
According to http://developers.facebook.com/tools/explorer/?method=GET&path=microsoft%2Fstatuses - I need an access token. As the Microsoft page is 'public', is this definitely the case? Is there no way for me to access these public statuses without an access token?
If this is the case, what is the correct method of creating an access token for my website? I have an App ID, however all of the examples at http://developers.facebook.com/docs/authentication/ describe handling user login. I simply want to get the latest status update on the Microsoft page and display it on my site.
This is by design. It used to be possible to fetch the latest status from a public page without an access token, but that was changed in order to block unidentified anonymous access to the API. You can get an access token for the application (if you don't have a Facebook application set up for your website, you should create one) with the following call using the Graph API:
https://graph.facebook.com/oauth/access_token?
client_id=YOUR_APP_ID&client_secret=YOUR_APP_SECRET&
grant_type=client_credentials
This is called an App Access Token. Then you proceed with the actual API call using the app access token from above.
Hope this helps.
You can use your App ID and secret key to get the public posts/feed of any page. This way you don't need to make a separate access token call. Call it like below:
https://graph.facebook.com/PAGE-ID/feed?access_token=APP-ID|APP-SECRET
And to get posts.
https://graph.facebook.com/PAGE-ID/posts?access_token=APP-ID|APP-SECRET
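If it helps, the same call from JavaScript is just a fetch against that URL. A sketch, where PAGE-ID, APP-ID and APP-SECRET are placeholders; since the app secret must stay secret, this belongs on your server rather than in browser code:
const token = 'APP-ID|APP-SECRET'; // app access token, keep server-side
fetch('https://graph.facebook.com/PAGE-ID/posts?access_token=' + encodeURIComponent(token))
    .then(res => res.json())
    .then(json => console.log(json.data))
    .catch(err => console.error(err));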
It's no longer possible to use the Facebook Graph API without an access token for reading public page statuses; this is covered by the Page Public Content Access permission in the Facebook API. Even an access token is not enough: you have to use appsecret_proof along with the access token in order to validate that you are the legitimate user. https://developers.facebook.com/blog/post/v2/2018/12/10/verification-for-individual-developers/.
If you are an individual developer, you have access to only three pages of data (limited), unless you own a business app.
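For reference, appsecret_proof is an HMAC-SHA256 of the access token keyed with the app secret (per Facebook's securing-requests documentation). A Node.js sketch with placeholder values, again something to compute server-side only:
const crypto = require('crypto');

function appSecretProof(accessToken, appSecret) {
    // hex-encoded HMAC-SHA256 of the token, keyed with the app secret
    return crypto.createHmac('sha256', appSecret).update(accessToken).digest('hex');
}

const accessToken = 'ACCESS-TOKEN';
const url = 'https://graph.facebook.com/PAGE-ID/posts' +
    '?access_token=' + encodeURIComponent(accessToken) +
    '&appsecret_proof=' + appSecretProof(accessToken, 'APP-SECRET');
console.log(url);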
You can get the posts by simply requesting the site that your browser would request and then extracting the posts from the HTML.
In NodeJS you can do it like this:
// npm i request cheerio request-promise-native
const rp = require('request-promise-native'); // requires installation of `request`
const cheerio = require('cheerio');
function GetFbPosts(pageUrl) {
    const requestOptions = {
        url: pageUrl,
        headers: {
            'User-Agent': 'Mozilla/5.0 (X11; Fedora; Linux x86_64; rv:64.0) Gecko/20100101 Firefox/64.0'
        }
    };
    return rp.get(requestOptions).then( postsHtml => {
        const $ = cheerio.load(postsHtml);
        const timeLinePostEls = $('.userContent').map((i,el)=>$(el)).get();
        const posts = timeLinePostEls.map(post=>{
            return {
                message: post.html(),
                created_at: post.parents('.userContentWrapper').find('.timestampContent').html()
            }
        });
        return posts;
    });
}
GetFbPosts('https://www.facebook.com/pg/officialstackoverflow/posts/').then(posts=>{
    // Log all posts
    for (const post of posts) {
        console.log(post.created_at, post.message);
    }
});
For more information and an example of how to retrieve more than 20 posts see: https://stackoverflow.com/a/54267937/2879085
I had a similar use case for some weeks and I used this API:
https://rapidapi.com/axesso/api/axesso-facebook-data-service/
I could fetch all posts and comments within a few minutes; it worked quite well for me.

EventSource and basic http authentication

Does anyone know if it is possible to send basic HTTP authentication credentials with EventSource?
I'm looking for a solution to the same problem. This post here says this:
Another caveat is that as far as we know, you cannot change the HTTP headers when using EventSource, which means you have to submit an authorization query string param with the value that you would have inserted using HTTP Basic Auth: a base64 encoded concatenation of your login and a token.
Here is the code from the post:
// First, we create the event source object, using the right URL.
var url = "https://stream.superfeedr.com/?";
url += "&hub.mode=retrieve";
url += "&hub.topic=http%3A%2F%2Fpush-pub.appspot.com%2Ffeed";
url += "&authorization=anVsaWVuOjJkNTVjNDhjMDY5MmIzZWFkMjA4NDFiMGViZDVlYzM5";
var source = new EventSource(url);

// When the socket has been open, let's cleanup the UI.
source.onopen = function () {
    var node = document.getElementById('sse-feed');
    while (node.hasChildNodes()) {
        node.removeChild(node.lastChild);
    }
};

// Superfeedr will trigger 'notification' events, which corresponds
// exactly to the data sent to your subscription endpoint
// (webhook or XMPP JID), with a JSON payload by default.
source.addEventListener("notification", function(e) {
    var notification = JSON.parse(e.data);
    notification.items.sort(function(x, y) {
        return x.published - y.published;
    });
    notification.items.forEach(function(i) {
        var node = document.getElementById('sse-feed');
        var item = document.createElement("li");
        var t = document.createTextNode([new Date(i.published * 1000), i.title, i.content].join(' '));
        item.appendChild(t);
        node.insertBefore(item, node.firstChild);
        // We add the element to the UI.
    });
});
If you're talking about cookies (not HTTP auth):
EventSource uses HTTP, so cookies are sent with the EventSource connection request.
HTTP auth should be supported like any other HTTP URL, although per the spec CORS + HTTP auth is not supported.
Nowadays there is an npm package to change the HTTP headers:
https://www.npmjs.com/package/eventsource
This library is a pure JavaScript implementation of the EventSource client. The API aims to be W3C compatible. You can use it with Node.js or as a browser polyfill for browsers that don't have native EventSource support.
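A sketch of how that package is typically used in Node; the options object with headers in the second constructor argument comes from the package README, so check the README for the version you install, since the API has changed across major versions:
const EventSource = require('eventsource'); // npm i eventsource

const es = new EventSource('https://stream.example.com/feed', {
    headers: { Authorization: 'Basic ' + Buffer.from('login:token').toString('base64') }
});
es.onmessage = e => console.log(e.data);
es.onerror = err => console.error(err);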
You can use event-source-polyfill to add headers like this:
import { EventSourcePolyfill } from 'event-source-polyfill';

new EventSourcePolyfill(`/api/liveUpdate`, {
    headers: {
        Authorization: `Bearer 12345`,
        'x-csrf-token': `xxx-xxx-xxx`,
    },
});
EventSource is about the server sending events to the client. I think you need bidirectional communication for authentication. How would you otherwise send the actual credentials?
WebSockets, however, can achieve that. Is that what you are looking for?
Update:
You can achieve what you want by utilizing cookies, as pointed out by 4esn0k. Cookies are sent along with the initial request that the browser makes to establish the connection, so just make sure you set the session identifier cookie before launching any EventSource connections.
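In code that is as small as it sounds; a sketch where the cookie name and the /events endpoint are only illustrative, and the cookie is sent automatically because the stream URL is same-origin:
// Set (or otherwise ensure) the session cookie before opening the stream:
document.cookie = 'sessionid=abc123; path=/';

const source = new EventSource('/events');
source.onmessage = e => console.log('event:', e.data);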

HTTP headers in Websockets client API

Looks like it's easy to add custom HTTP headers to your websocket client with any HTTP client that supports them, but I can't find how to do it with the web platform's WebSocket API.
Anyone has a clue on how to achieve it?
var ws = new WebSocket("ws://example.com/service");
Specifically, I need to be able to send an HTTP Authorization header.
Updated 2x
Short answer: No, only the path and protocol field can be specified.
Longer answer:
There is no method in the JavaScript WebSockets API for specifying additional headers for the client/browser to send. The HTTP path ("GET /xyz") and protocol header ("Sec-WebSocket-Protocol") can be specified in the WebSocket constructor.
The Sec-WebSocket-Protocol header (which is sometimes extended to be used in websocket specific authentication) is generated from the optional second argument to the WebSocket constructor:
var ws = new WebSocket("ws://example.com/path", "protocol");
var ws = new WebSocket("ws://example.com/path", ["protocol1", "protocol2"]);
The above results in the following headers:
Sec-WebSocket-Protocol: protocol
and
Sec-WebSocket-Protocol: protocol1, protocol2
A common pattern for achieving WebSocket authentication/authorization is to implement a ticketing system where the page hosting the WebSocket client requests a ticket from the server and then passes this ticket during WebSocket connection setup either in the URL/query string, in the protocol field, or required as the first message after the connection is established. The server then only allows the connection to continue if the ticket is valid (exists, has not been already used, client IP encoded in ticket matches, timestamp in ticket is recent, etc). Here is a summary of WebSocket security information: https://devcenter.heroku.com/articles/websocket-security
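A sketch of the client side of that ticketing flow; the /ws-ticket endpoint and the ticket query parameter are made up for illustration, and the server-side validation described above is the part that actually matters:
async function openAuthedSocket() {
    // 1. Ask the normal, authenticated HTTP API for a one-time ticket.
    const resp = await fetch('/ws-ticket', { method: 'POST', credentials: 'include' });
    const { ticket } = await resp.json();
    // 2. Present the ticket during the WebSocket handshake, here via the query string.
    return new WebSocket('wss://example.com/service?ticket=' + encodeURIComponent(ticket));
}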
Basic authentication was formerly an option but this has been deprecated and modern browsers don't send the header even if it is specified.
Basic Auth Info (Deprecated - No longer functional):
NOTE: the following information is no longer accurate in any modern browsers.
The Authorization header is generated from the username and password (or just username) field of the WebSocket URI:
var ws = new WebSocket("ws://username:password#example.com")
The above results in the following header with the string "username:password" base64 encoded:
Authorization: Basic dXNlcm5hbWU6cGFzc3dvcmQ=
I have tested basic auth in Chrome 55 and Firefox 50 and verified that the basic auth info is indeed negotiated with the server (this may not work in Safari).
Thanks to Dmitry Frank for the basic auth answer.
More of an alternate solution, but all modern browsers send the domain cookies along with the connection, so using:
var authToken = 'R3YKZFKBVi';
document.cookie = 'X-Authorization=' + authToken + '; path=/';
var ws = new WebSocket(
'wss://localhost:9000/wss/'
);
End up with the request connection headers:
Cookie: X-Authorization=R3YKZFKBVi
Sending Authorization header is not possible.
Attaching a token query parameter is an option. However, in some circumstances, it may be undesirable to send your main login token in plain text as a query parameter, because it is less opaque than using a header and will end up being logged who knows where. If this raises security concerns for you, an alternative is to use a secondary JWT token just for the web socket stuff.
Create a REST endpoint for generating this JWT, which can of course only be accessed by users authenticated with your primary login token (transmitted via header). The web socket JWT can be configured differently than your login token, e.g. with a shorter timeout, so it's safer to send around as query param of your upgrade request.
Create a separate JwtAuthHandler for the same route you register the SockJS eventbusHandler on. Make sure your auth handler is registered first, so you can check the web socket token against your database (the JWT should be somehow linked to your user in the backend).
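A rough client-side sketch of that two-token flow, with illustrative endpoint, parameter and field names:
async function connectEventBus(loginToken) {
    // Exchange the primary login token (sent as a header) for a short-lived WebSocket JWT.
    const resp = await fetch('/api/ws-token', {
        headers: { Authorization: 'Bearer ' + loginToken }
    });
    const { wsToken } = await resp.json();
    // Pass the short-lived token on the upgrade request as a query parameter.
    return new WebSocket('wss://example.com/eventbus?token=' + encodeURIComponent(wsToken));
}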
The HTTP Authorization header problem can be addressed with the following:
var ws = new WebSocket("ws://username:password@example.com/service");
Then a proper Basic Authorization HTTP header will be set with the provided username and password. If you need Basic Authorization, you're all set.
I want to use Bearer, however, and I resorted to the following trick: I connect to the server as follows:
var ws = new WebSocket("ws://my_token@example.com/service");
And when my code on the server side receives a Basic Authorization header with a non-empty username and an empty password, it interprets the username as a token.
You cannot add headers but, if you just need to pass values to the server at the moment of the connection, you can specify a query string part on the url:
var ws = new WebSocket("ws://example.com/service?key1=value1&key2=value2");
That URL is valid but - of course - you'll need to modify your server code to parse it.
You cannot send custom headers when you want to establish a WebSocket connection using the JavaScript WebSockets API.
You can use subprotocol headers by using the second argument of the WebSocket constructor:
var ws = new WebSocket("ws://example.com/service", "soap");
and then you can read the subprotocol header using the Sec-WebSocket-Protocol key on the server.
There is also a limitation: your subprotocol header values cannot contain a comma (,)!
For those still struggling in 2021: the Node.js global WebSocket class has an additional options field in the constructor. If you go to the implementation of the WebSocket class, you will find this variable declaration. You can see it accepts three params: url, which is required; protocols (optional), which is either a string, an array of strings, or null; and a third param, options, which is our interest here, an object (and still optional). See:
declare var WebSocket: {
    prototype: WebSocket;
    new (
        uri: string,
        protocols?: string | string[] | null,
        options?: {
            headers: { [headerName: string]: string };
            [optionName: string]: any;
        } | null,
    ): WebSocket;
    readonly CLOSED: number;
    readonly CLOSING: number;
    readonly CONNECTING: number;
    readonly OPEN: number;
};
If you are using a Node.js library like React or React Native, here is an example of how you can do it.
const ws = new WebSocket(WEB_SOCKETS_URL, null, {
    headers: {
        ['Set-Cookie']: cookie,
    },
});
Notice that for the protocols I have passed null. If you are using JWT, you can pass the Authorization header with Bearer + token.
Disclaimer, this might not be supported by all browsers outside the box, from the MDN web docs you can see only two params are documented.
see https://developer.mozilla.org/en-US/docs/Web/API/WebSocket/WebSocket#syntax
Totally hacked it like this, thanks to kanaka's answer.
Client:
var ws = new WebSocket(
    'ws://localhost:8080/connect/' + this.state.room.id,
    store('token') || cookie('token')
);
Server (using Koa2 in this example, but should be similar wherever):
var url = ctx.websocket.upgradeReq.url; // can use to get url/query params
var authToken = ctx.websocket.upgradeReq.headers['sec-websocket-protocol'];
// Can then decode the auth token and do any session/user stuff...
In my situation (Azure Time Series Insights, wss://), I was using the ReconnectingWebsocket wrapper and was able to add headers with a simple solution:
socket.onopen = function(e) {
    socket.send(payload);
};
Where payload in this case is:
{
    "headers": {
        "Authorization": "Bearer TOKEN",
        "x-ms-client-request-id": "CLIENT_ID"
    },
    "content": {
        "searchSpan": {
            "from": "UTCDATETIME",
            "to": "UTCDATETIME"
        },
        "top": {
            "sort": [
                {
                    "input": { "builtInProperty": "$ts" },
                    "order": "Asc"
                }
            ],
            "count": 1000
        }
    }
}
To all future debuggers - as of today, i.e. 15-07-21:
Browsers also don't support sending custom headers to the server, so any code like this:
import * as sock from 'websocket'

const headers = {
    Authorization: "bearer " + token
};
console.log(headers);
const wsclient = new sock.w3cwebsocket(
    'wss://' + 'myserver.com' + '/api/ws',
    '',
    '',
    headers,
    null
);
This is not going to work in the browser. The reason behind that is that the browser's native WebSocket constructor does not accept headers.
You can easily be misled because the w3cwebsocket constructor accepts headers, as I have shown above. That works in Node.js, however.
The recommended way to do this is through URL query parameters
// authorization: Basic abc123
// content-type: application/json
let ws = new WebSocket(
"ws://example.com/service?authorization=basic%20abc123&content-type=application%2Fjson"
);
This is considered a safe best-practice because:
Headers aren't supported by WebSockets
Headers are advised against during the HTTP -> WebSocket upgrade because CORS is not enforced
SSL encrypts query parameters
Browsers don't cache WebSocket connections the same way they do with URLs
What I have found works best is to send your jwt to the server just like a regular message. Have the server listening for this message and verify at that point. If valid add it to your stored list of connections. Otherwise send back a message saying it was invalid and close the connection. Here is the client side code. For context the backend is a nestjs server using Websockets.
socket.send(
    JSON.stringify({
        event: 'auth',
        data: jwt
    })
);
My case:
I want to connect to a production WS server at www.mycompany.com/api/ws...
using real credentials (a session cookie)...
from a local page (localhost:8000).
Setting document.cookie = "sessionid=foobar;path=/" won't help as the domains don't match.
The solution:
Add 127.0.0.1 wsdev.mycompany.com to /etc/hosts.
This way your browser will use cookies from mycompany.com when connecting to www.mycompany.com/api/ws, as you are connecting from a valid subdomain, wsdev.mycompany.com.
You can pass the headers as a key-value in the third parameter (options) inside an object.
Example with Authorization token. Left the protocol (second parameter) as null
ws = new WebSocket('ws://localhost', null, { headers: { Authorization: token } })
Edit: Seems that this approach only works with nodejs library not with standard browser implementation. Leaving it because it might be useful to some people.
Technically, you will be sending these headers through the connect function before the protocol upgrade phase. This worked for me in a nodejs project:
var WebSocketClient = require('websocket').client;
var ws = new WebSocketClient();
// connect(requestUrl, requestedProtocols, origin, headers, requestOptions)
ws.connect(url, '', null, headers);
