BINANCE API - How to get Account info with User Data Stream - javascript

I'm using Node and the ws npm package to work with WebSockets. I got the listenKey as stated in the docs (linked below), but I'm unable to get my account info using the User Data Stream. I'd prefer to use a stream to read my most current account info (balances, etc.) since using the REST API for it incurs a penalty (WEIGHT: 5) each time.
I've tried doing ws.send('outboundAccountInfo') but no joy.
DOCS: https://github.com/binance-exchange/binance-official-api-docs/blob/master/user-data-stream.md
Full code example - does not return any data:
import request from 'request'
import WebSocket from 'ws'
import { API_KEY } from '../../assets/secrets'
const DATA_STREAM_ENDPOINT = 'wss://stream.binance.com:9443/ws'
const BINANCE_API_ROOT = 'https://api.binance.com'
const LISTEN_KEY_ENDPOINT = `${BINANCE_API_ROOT}/api/v1/userDataStream`
const fetchAccountWebsocketData = async () => {
  const listenKey = await fetchListenKey()
  console.log('-> ', listenKey) // valid key is returned
  let ws
  try {
    ws = await openWebSocket(`${DATA_STREAM_ENDPOINT}/${listenKey}`)
  } catch (err) {
    throw (`ERROR - fetchAccountWebsocketData: ${err}`)
  }
  // Nothing returns from either
  ws.on('message', data => console.log(data))
  ws.on('outboundAccountInfo', accountData => console.log(accountData))
}
const openWebSocket = endpoint => {
  const p = new Promise((resolve, reject) => {
    const ws = new WebSocket(endpoint)
    console.log('\n-->> New Account Websocket')
    ws.on('open', () => {
      console.log('\n-->> Websocket Account open...')
      resolve(ws)
    }, err => {
      console.log('fetchAccountWebsocketData error:', err)
      reject(err)
    })
  })
  p.catch(err => console.log(`ERROR - fetchAccountWebsocketData: ${err}`))
  return p
}
const fetchListenKey = () => {
  const p = new Promise((resolve, reject) => {
    const options = {
      url: LISTEN_KEY_ENDPOINT,
      headers: { 'X-MBX-APIKEY': API_KEY }
    }
    request.post(options, (err, httpResponse, body) => {
      if (err)
        return reject(err)
      resolve(JSON.parse(body).listenKey)
    })
  })
  p.catch(err => console.log(`ERROR - fetchListenKey: ${err}`))
  return p
}
export default fetchAccountWebsocketData

I was struggling with this too ... for hours!
https://www.reddit.com/r/BinanceExchange/comments/a902cq/user_data_streams_has_anyone_used_it_successfully/
The Binance user data stream doesn't return anything when you connect to it, only when something changes in your account. Try running your code, then go to Binance and place an order in the book; you should see some data show up.
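In other words, keep the generic 'message' handler attached and filter on the event type once updates start arriving. A minimal sketch (the event names and the roughly 30-minute listenKey keep-alive come from the linked docs; the exact shape of the PUT request here is an assumption):
// All user-data events arrive on 'message'; the payload's "e" field says which
// event it is (e.g. 'outboundAccountInfo' for balances, 'executionReport' for orders).
ws.on('message', msg => {
  const data = JSON.parse(msg)
  if (data.e === 'outboundAccountInfo') {
    console.log('balances:', data.B)
  }
})

// The listenKey expires unless it is pinged about every 30 minutes,
// so a keep-alive along these lines is assumed to be needed as well:
setInterval(() => {
  request.put({
    url: LISTEN_KEY_ENDPOINT,
    headers: { 'X-MBX-APIKEY': API_KEY },
    form: { listenKey },
  })
}, 30 * 60 * 1000)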

What am I missing here to get data out of this spawned Node.js child process?

I'm trying to use a spawned command-line lzip process to expand an lzipped data stream, as I haven't found any good native JavaScript tools to do the job.
I can get this to work using files and file descriptors, but it seems stupid to have to write out, and read back in, a bunch of temporary scratch files. I want to do all of the work I can in memory.
So here's the code I'm trying to use:
import { requestBinary } from 'by-request';
import { spawn } from 'child_process';
import { min } from '#tubular/math';
export async function tarLzToZip(url: string): Promise<void> {
const lzData = await requestBinary(url, { headers: { 'User-Agent': 'curl/7.64.1' } });
const lzipProc = spawn('lzip', ['-d'], { stdio: ['pipe', 'pipe', process.stderr] });
let tarContent = Buffer.alloc(0);
lzipProc.stdout.on('data', data => {
tarContent = Buffer.concat([tarContent, data], tarContent.length + data.length);
});
for (let offset = 0; offset < lzData.length; offset += 4096) {
await new Promise<void>((resolve, reject) => {
lzipProc.stdin.write(lzData.slice(offset, min(offset + 4096, lzData.length)), err => {
if (err)
reject(err);
else
resolve();
});
});
}
await new Promise<void>((resolve, reject) => {
lzipProc.stdin.end((err: any) => {
if (err)
reject(err);
else
resolve();
});
});
console.log('data length:', tarContent.length);
}
When I step through with a debugger, everything seems to go well with sending data into lzipProc.stdin. (I've tried both chunks like this, and all the data in one go.) The lzipProc.stdout.on('data', ...) handler, however, never gets called. When I get to the end, tarContent is empty.
What's missing here? Do I need a different stdio config? Are there different stream objects I should be using? Do I need more goats to sacrifice under the light of a full moon?
UPDATE
My solution based on Matt's excellent answer posted below, with all of the particulars for my use case:
import archiver from 'archiver';
import fs, { ReadStream } from 'fs';
import fsp from 'fs/promises';
import needle from 'needle';
import path from 'path';
import { spawn } from 'child_process';
import tar from 'tar-stream';
const baseUrl = 'https://data.iana.org/time-zones/releases/';
export async function codeAndDataToZip(version: string): Promise<ReadStream> {
return compressedTarToZip(`${baseUrl}tzdb-${version}.tar.lz`);
}
export async function codeToZip(version: string): Promise<ReadStream> {
return compressedTarToZip(`${baseUrl}tzcode${version}.tar.gz`);
}
export async function dataToZip(version: string): Promise<ReadStream> {
return compressedTarToZip(`${baseUrl}tzdata${version}.tar.gz`);
}
async function compressedTarToZip(url: string): Promise<ReadStream> {
const fileName = /([-a-z0-9]+)\.tar\.[lg]z$/i.exec(url)[1] + '.zip';
const filePath = path.join(process.env.TZE_ZIP_DIR || path.join(__dirname, 'tz-zip-cache'), fileName);
if (await fsp.stat(filePath).catch(() => false))
return fs.createReadStream(filePath);
const [command, args] = url.endsWith('.lz') ? ['lzip', ['-d']] : ['gzip', ['-dc']];
const originalArchive = needle.get(url, { headers: { 'User-Agent': 'curl/7.64.1' } });
const tarExtract = tar.extract({ allowUnknownFormat: true });
const zipPack = archiver('zip');
const writeFile = fs.createWriteStream(filePath);
const commandProc = spawn(command, args);
commandProc.stderr.on('data', msg => { throw new Error(`${command} error: ${msg}`); });
commandProc.stderr.on('error', err => { throw err; });
originalArchive.pipe(commandProc.stdin);
commandProc.stdout.pipe(tarExtract);
tarExtract.on('entry', (header, stream, next) => {
zipPack.append(stream, { name: header.name, date: header.mtime });
stream.on('end', next);
});
tarExtract.on('finish', () => zipPack.finalize());
zipPack.pipe(writeFile);
return new Promise<ReadStream>((resolve, reject) => {
const rejectWithError = (err: any): void =>
reject(err instanceof Error ? err : new Error(err.message || err.toString()));
writeFile.on('error', rejectWithError);
writeFile.on('finish', () => resolve(fs.createReadStream(filePath)));
tarExtract.on('error', err => {
// tar-stream has a problem with the format of a few of the tar files
// dealt with here, which nevertheless are valid archives.
if (/unexpected end of data|invalid tar header/i.test(err.message))
console.error('Archive %s: %s', url, err.message);
else
reject(err);
});
zipPack.on('error', rejectWithError);
zipPack.on('warning', rejectWithError);
commandProc.on('error', rejectWithError);
commandProc.on('exit', err => err && reject(new Error(`${command} error: ${err}`)));
originalArchive.on('error', rejectWithError);
});
}
I would leave the streaming to Node or packages, unless you have specific processing that needs to be done. Just wrap the whole stream setup in a promise.
If you also stream the request/response, it can be piped into the decompressor. Then stdout from the decompressor can be piped to the archive stream handlers.
import fs from 'fs'
import { spawn } from 'child_process'
import needle from 'needle'
import tar from 'tar-stream'
import archiver from 'archiver'
export function tarLzToZip(url) {
  return new Promise((resolve, reject) => {
    // Setup streams
    const res = needle.get(url)
    const lzipProc = spawn('lzip', ['-dc'], { stdio: ['pipe', 'pipe', process.stderr] })
    const tarExtract = tar.extract()
    const zipPack = archiver('zip')
    const writeFile = fs.createWriteStream('tardir.zip')

    // Pipelines and processing
    res.pipe(lzipProc.stdin)
    lzipProc.stdout.pipe(tarExtract)

    // tar -> zip (simple file name)
    tarExtract.on('entry', function(header, stream, next) {
      console.log('entry', header)
      zipPack.append(stream, { name: header.name })
      stream.on('end', () => next())
    })
    tarExtract.on('finish', function() {
      zipPack.finalize()
    })
    zipPack.pipe(writeFile)

    // Handle the things
    writeFile.on('error', reject)
    writeFile.on('close', () => console.log('write close'))
    writeFile.on('finish', () => resolve(true))
    tarExtract.on('error', reject)
    zipPack.on('error', reject)
    zipPack.on('warning', reject)
    lzipProc.on('error', reject)
    lzipProc.on('exit', code => { if (code !== 0) reject(new Error(`lzip ${code}`)) })
    res.on('error', reject)
    res.on('done', () => console.log('request done', res.request.statusCode))
  })
}
You might want to be a bit more verbose about logging errors and stderr, as a single promise rejection can easily hide what actually happened across the multiple streams.
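For instance, a small addition like this inside the same Promise wrapper (a sketch, assuming the child is spawned with stdio: ['pipe', 'pipe', 'pipe'] so stderr is captured rather than forwarded to process.stderr) keeps the decompressor's error output around until the exit event:
// Collect lzip's stderr so a useful message survives until 'exit'.
let stderrOutput = ''
lzipProc.stderr.on('data', chunk => { stderrOutput += chunk })
lzipProc.on('exit', code => {
  if (code !== 0) {
    reject(new Error(`lzip exited with code ${code}: ${stderrOutput.trim()}`))
  }
})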

Returning promises won't work for AWS SDK

I've created an API which calls the AWS CloudWatch API and gives back datapoints that can be graphed in my app. I have separate routes for each package (as shown in the routing code below). The API follows a REST MVC pattern.
So here are a couple of things I'm doing in my function:
1. Reading in EC2 instance data from a SQLite3 database to grab information about a running instance (IP, instance_id, instance_launchtime) so that I can put it in the parameters required for the getMetricStatistics API from the AWS SDK.
2. The data from step 1 is then put into an array of parameters (three, which return three different metric datapoints). The function loops through each parameter, passing it to getMetricStatistics (one by one, since getMetricStatistics doesn't accept multiple metrics at once) to collect data points for that instance and push them to an array.
Because the database call is asynchronous, I've wrapped it in a promise. When I load the endpoint in my browser, it just keeps loading and won't show any data. When I refresh the page, however, it shows all the results correctly...
This is my controller for the API:
// Return results sent from Cloud Watch API
const InsightModel = require('../models/insight.model.js');
const cloudWatch = InsightModel.cloudWatch;
const CWParams = InsightModel.CWParams;
const packageById = InsightModel.packageById;
let cpuUtilParam;
let cpuCBParam;
let cpuCUParam;
let insightParams = [];
let metricResults = [];
exports.getAnalytics = (req, res) => {
const currentDate = new Date().toISOString();
let promise1 = new Promise((resolve, reject) => {
packageById(req.params.packageKey, (err, data) => {
if (err) {
reject(
res.status(500).send({
message:
err.message ||
'Error while getting the insight configuration data.',
})
);
} else {
cpuUtilParam = new CWParams(
currentDate,
'CPUUtilization',
'AWS/EC2',
data[0].launch_time,
data[0].instance_id
);
cpuCBParam = new CWParams(
currentDate,
'CPUCreditBalance',
'AWS/EC2',
data[0].launch_time,
data[0].instance_id
);
cpuCUParam = new CWParams(
currentDate,
'CPUCreditUsage',
'AWS/EC2',
data[0].launch_time,
data[0].instance_id
);
insightParams = [cpuUtilParam, cpuCBParam, cpuCUParam];
resolve(insightParams);
}
});
})
let promise2 = new Promise((resolve, reject) => {
insightParams.forEach(metric => {
cloudWatch.getMetricStatistics(metric, function(err, data) {
if (err) {
reject(
res.status(500).send({
message:
err.message ||
'Error occurred while running cloudWatch getMetricStatistics API: ',
})
);
} else {
metricResults.push(data);
if (metricResults.length === insightParams.length)
resolve(metricResults);
}
});
});
});
Promise.all([promise1, promise2])
.then(metricResults => {
res.send(metricResults);
console.log('AWS CW API successful');
})
.catch(err => {
res.status(500).send({
message:
err.message ||
'Error occurred while reading in a promise from cloudWatch getMetricStatistics API: ',
})
});
metricResults = [];
};
The model for the API:
// Call AWS Cost Explorer API
const AWS = require('aws-sdk');
const config = require('./AWSconfig');
const database = require('./db');
const insightdb = database.insightdb;
AWS.config.update({
accessKeyId: config.accessKeyId,
secretAccessKey: config.secretAccessKey,
region: config.region,
});
//Linking AWS CloudWatch Service
var cloudWatch = new AWS.CloudWatch();
const packageById = (packageId, callback) => {
insightdb.all(
'SELECT * FROM ec2Instance WHERE package_id == ?',
packageId,
(err, rows) => {
if (err) {
callback(err, null);
} else {
callback(null, rows);
}
}
);
};
// Parameter class to feed into the CloudWatch getMetricStatistics function
const CWParams = function(reqDate, metricName,service,launchTime,instanceId) {
(this.EndTime = reqDate) /* required */,
(this.MetricName = metricName) /* required */,
(this.Namespace = service) /* required */,
(this.Period = 3600) /* required */,
(this.StartTime = launchTime) /* ${createDate}`, required */,
(this.Dimensions = [
{
Name: 'InstanceId' /* required */,
Value: instanceId /* required */,
},
]),
(this.Statistics = ['Maximum']);
};
//Exports variables to the controller (so they can be re-used)
module.exports = { cloudWatch, CWParams, packageById };
The route for the API:
module.exports = app => {
const insight = require('../controllers/insight.controller.js');
app.get('/insights/aws/:packageKey', insight.getAnalytics);
};
As it stands, in the second Promise constructor, insightParams is guaranteed not to have been composed yet because insightParams = [.....] is in a callback that is called asynchronously. Therefore, the program flow needs to ensure all the "promise2" stuff happens only after "promise1" is fulfilled.
Things become a lot simpler in the higher level code if asynchronous functions are "promisified" at the lowest possible level. So do two things in the model:
Promisify cloudWatch.getMetricStatistics()
Write packageById() to return Promise rather than accepting a callback.
The model thus becomes:
const AWS = require('aws-sdk'); // no change
const config = require('./AWSconfig'); // no change
const database = require('./db'); // no change
const insightdb = database.insightdb; // no change
AWS.config.update({
accessKeyId: config.accessKeyId,
secretAccessKey: config.secretAccessKey,
region: config.region
}); // no change
var cloudWatch = new AWS.CloudWatch(); // no change
// Promisify cloudWatch.getMetricStatistics() as cloudWatch.getMetricStatisticsAsync().
cloudWatch.getMetricStatisticsAsync = (metric) => {
return new Promise((resolve, reject) => {
cloudWatch.getMetricStatistics(metric, function(err, data) {
if (err) {
if(!err.message) { // Probably not necessary but here goes ...
err.message = 'Error occurred while running cloudWatch getMetricStatistics API: ';
}
reject(err); // (very necessary)
} else {
resolve(data);
}
});
});
};
// Ensure that packageById() returns Promise rather than accepting a callback.
const packageById = (packageId) => {
return new Promise((resolve, reject) => {
insightdb.all('SELECT * FROM ec2Instance WHERE package_id == ?', packageId, (err, rows) => {
if (err) {
reject(err);
} else {
resolve(rows);
}
});
});
};
Now getAnalytics() can be written like this:
exports.getAnalytics = (req, res) => {
  packageById(req.params.packageKey)
    .then(data => {
      const currentDate = new Date().toISOString();
      let insightParams = [
        new CWParams(currentDate, 'CPUUtilization', 'AWS/EC2', data[0].launch_time, data[0].instance_id),
        new CWParams(currentDate, 'CPUCreditBalance', 'AWS/EC2', data[0].launch_time, data[0].instance_id),
        new CWParams(currentDate, 'CPUCreditUsage', 'AWS/EC2', data[0].launch_time, data[0].instance_id)
      ];
      // Composition of `insightParams` is synchronous, so you can continue
      // with the `cloudWatch.getMetricStatisticsAsync()` stuff inside the same .then().
      return Promise.all(insightParams.map(metric => cloudWatch.getMetricStatisticsAsync(metric))); // Simple because of the Promisification above.
    }, err => {
      // This callback handles errors from packageById() only,
      // and is probably unnecessary but here goes ...
      if (!err.message) {
        err.message = 'Error while getting the insight configuration data.';
      }
      throw err;
    })
    .then(metricResults => {
      res.send(metricResults);
      console.log('AWS CW API successful');
    })
    .catch(err => {
      // Any async error arising above will drop through to here.
      res.status(500).send({
        message: err.message
      });
    });
};
Note that multiple catches, each with res.status(500).send(), are not necessary. Error propagation down the Promise chain allows a single, terminal .catch().
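As an aside, and not something the answer above depends on: the AWS SDK for JavaScript (v2) already exposes a .promise() helper on its request objects, so the hand-rolled promisification could probably be reduced to something like:
// Sketch, assuming aws-sdk v2: calling the API without a callback returns an
// AWS.Request, and .promise() converts it into a native Promise.
const getMetricStatisticsAsync = metric =>
  cloudWatch.getMetricStatistics(metric).promise();

// Usage inside getAnalytics():
// return Promise.all(insightParams.map(getMetricStatisticsAsync));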

Transaction numbers are only allowed on storage engines that support document-level locking - MongodbMemoryServer/Mochai/Chai/Supertest

FIXED: use storageEngine: "wiredTiger"
I use Mocha / Chai / Supertest and mongodb-memory-server to test my app, but I received this error: Transaction numbers are only allowed on storage engines that support document-level locking
Against the real database, tested via Postman, it works fine.
My code:
In database.js
const mongoose = require('mongoose')
const { MongoMemoryReplSet } = require('mongodb-memory-server')
mongoose.set('useFindAndModify', false);
const connect = async () => {
try {
let url = process.env.MONGO_URL
let options = {
//Something
}
if (process.env.NODE_ENV === 'test') {
const replSet = new MongoMemoryReplSet();
await replSet.waitUntilRunning();
const uri = await replSet.getUri();
await mongoose.connect(uri, options)
//log connected
} else {
await mongoose.connect(url, options)
//log connected
}
} catch (error) {
//error
}
}
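For reference, the storageEngine fix mentioned at the top would be applied where the in-memory replica set is created; a rough sketch (option shape per mongodb-memory-server's MongoMemoryReplSet, so adjust for the version you have installed):
// Force the in-memory replica set onto WiredTiger so transactions are supported.
const replSet = new MongoMemoryReplSet({
  replSet: { count: 1, storageEngine: 'wiredTiger' },
})
await replSet.waitUntilRunning()
const uri = await replSet.getUri()
await mongoose.connect(uri, options)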
I have two models: Company and User. I made a function to add a member to a company using a transaction. My code:
const addMember = async (req, res, next) => {
const { companyId } = req.params
const { userId } = req.body
const session = await mongoose.startSession()
try {
await session.withTransaction(async () => {
const [company, user] = await Promise.all([
Company.findOneAndUpdate(
//Something
).session(session),
User.findByIdAndUpdate(
//Something
).session(session)
])
//Something if... else
return res.json({
message: `Add member successfully!`,
})
})
} catch (error) {
//error
}
}
Here's the router:
router.post('/:companyId/add-member',
authentication.required,
company.addMember
)
Test file:
const expect = require('chai').expect
const request = require('supertest')
const app = require('../app')
describe('POST /company/:companyId/add-member', () => {
it('OK, add member', done => {
request(app).post(`/company/${companyIdEdited}/add-member`)
.set({ "x-access-token": signedUserTokenKey })
.send({userId: memberId})
.then(res => {
console.log(res.body)
expect(res.statusCode).to.equals(200)
done()
})
.catch((error) => done(error))
})
})
And I received the error: Transaction numbers are only allowed on storage engines that support document-level locking
How can I fix this?
Add retryWrites=false to your database URI. Example below:
mongodb://xx:xx@xyz.com:PORT,zz.com:33427/database-name?replicaSet=rs-xx&ssl=true&retryWrites=false
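If you'd rather not edit the URI itself, the same flag can presumably be passed through the connection options instead (a sketch; it assumes Mongoose forwards this option to the underlying MongoDB driver):
// Equivalent of appending ?retryWrites=false to the connection string.
let options = {
  // ...your existing options
  retryWrites: false,
}
await mongoose.connect(url, options)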

How can I connect multichain node with app?

I have to create an app based on MultiChain for a university project. I have to use the blockchain as a database and build an app that puts information into blocks.
I'm trying to use Meteor and the JSON-RPC API (https://github.com/scoin/multichain-node), but I can't connect to the node. Could someone help me, or suggest an alternative to Meteor?
I installed multichain-node with
npm install multichain-node --save
This created a multichain-node folder in node_modules.
In my main.js I'm trying to connect to the node (which is running in Terminal):
import './main.html';
console.log("b4 connection");
const connection = {
port: 6744,
host: '127.0.0.1',
user: "multichainrpc",
pass: "5zGVBTY7nVsnEmp3vbGq8LTbmnmjueYkiTLc5pRzE7xh"
}
const multichain = require("../node_modules/multichain-node/index.js")(connection);
console.log("info");
let listenForConfirmations = (txid) => {
console.log("WAITING FOR CONFIRMATIONS")
return new Promise((resolve, reject) => {
var interval = setInterval(() => {
getConfirmations(txid)
.then(confirmations => {
if(confirmations > 0){
clearInterval(interval);
return resolve()
}
})
.catch(err => {
return reject(err);
})
}, 5000)
})
}
let getConfirmations = async (txid) => {
let res = await multichain.getWalletTransaction({
txid: txid
})
return res.confirmations;
}
let startTests = () => {
const state = {};
console.log("Running Tests")
console.log("TEST: GET INFO")
multichain.getInfo((err, res) => {
console.log(res);
})
}
startTests()
This is the error in the Chrome console:
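One thing worth checking (an assumption on my part, since the console error itself isn't shown): multichain-node is a Node RPC client, and if this code runs in the browser bundle (which the Chrome console error suggests) it cannot open a connection to the node. A sketch of keeping the RPC call on the Meteor server and exposing it to the client through a method; the method name 'multichain.getInfo' is just an illustrative choice:
// server/main.js (sketch) - keep the RPC client on the server.
import { Meteor } from 'meteor/meteor'

const connection = {
  port: 6744,
  host: '127.0.0.1',
  user: 'multichainrpc',
  pass: '...', // RPC password from multichain.conf
}
const multichain = require('multichain-node')(connection)

Meteor.methods({
  'multichain.getInfo'() {
    // Meteor methods may return a Promise; the client receives the resolved value.
    return new Promise((resolve, reject) => {
      multichain.getInfo((err, res) => (err ? reject(err) : resolve(res)))
    })
  },
})

// client/main.js (sketch) - call it from the browser:
// Meteor.call('multichain.getInfo', (err, info) => console.log(err || info))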

how to receive data from bluetooth device using node.js

I am new to JavaScript and Node.js. I'm currently working on a medical project. First I will explain my task: I have to receive data from a Bluetooth device (normal BP rate, pulse rate) and display the readings in a web app using Node.js. I don't know how to receive data from the Bluetooth device (a patient monitor machine). Can you suggest some blogs or books to read? Thanks in advance.
You can use "node-bluetooth" to send and receive data from and to a device respectively. This is a sample code:-
const bluetooth = require('node-bluetooth');

// create bluetooth device inquiry instance
const device = new bluetooth.DeviceINQ();

device
  .on('finished', console.log.bind(console, 'finished'))
  .on('found', function found(address, name) {
    console.log('Found: ' + address + ' with name ' + name);
    device.findSerialPortChannel(address, function(channel) {
      console.log('Found RFCOMM channel for serial port on %s: ', name, channel);
      // make bluetooth connect to remote device
      bluetooth.connect(address, channel, function(err, connection) {
        if (err) return console.error(err);
        // receive data from the device
        connection.on('data', (buffer) => {
          console.log('received message:', buffer.toString());
        });
        // send data to the device
        connection.write(Buffer.from('Hello!', 'utf-8'));
      });
    });
  })
  .inquire();
It inquires for nearby devices and, for each one found, opens an RFCOMM serial connection so you can read and write data.
Try the noble library. Here's how I get information about my Xiaomi Mi Band 3 device:
const arrayBufferToHex = require('array-buffer-to-hex')
const noble = require('noble')

const DEVICE_INFORMATION_SERVICE_UUID = '180a'

noble.on('stateChange', state => {
  console.log(`State changed: ${state}`)
  if (state === 'poweredOn') {
    noble.startScanning()
  }
})

noble.on('discover', peripheral => {
  console.log(`Found device, name: ${peripheral.advertisement.localName}, uuid: ${peripheral.uuid}`)
  if (peripheral.advertisement.localName === 'Mi Band 3') {
    noble.stopScanning()
    peripheral.on('connect', () => console.log('Device connected'))
    peripheral.on('disconnect', () => console.log('Device disconnected'))
    peripheral.connect(error => {
      peripheral.discoverServices([DEVICE_INFORMATION_SERVICE_UUID], (error, services) => {
        console.log(`Found service, name: ${services[0].name}, uuid: ${services[0].uuid}, type: ${services[0].type}`)
        const service = services[0]
        service.discoverCharacteristics(null, (error, characteristics) => {
          characteristics.forEach(characteristic => {
            console.log(`Found characteristic, name: ${characteristic.name}, uuid: ${characteristic.uuid}, type: ${characteristic.type}, properties: ${characteristic.properties.join(',')}`)
          })
          characteristics.forEach(characteristic => {
            if (characteristic.name === 'System ID' || characteristic.name === 'PnP ID') {
              characteristic.read((error, data) => console.log(`${characteristic.name}: 0x${arrayBufferToHex(data)}`))
            } else {
              characteristic.read((error, data) => console.log(`${characteristic.name}: ${data.toString('ascii')}`))
            }
          })
        })
      })
    })
  }
})
You can use node-ble, a Node.js library that leverages D-Bus and avoids C++ bindings.
https://github.com/chrvadala/node-ble
Here's a basic example:
const { createBluetooth } = require('node-ble')

async function main () {
  const { bluetooth, destroy } = createBluetooth()

  // get bluetooth adapter
  const adapter = await bluetooth.defaultAdapter()
  await adapter.startDiscovery()
  console.log('discovering')

  // get device and connect
  const device = await adapter.waitDevice(TEST_DEVICE)
  console.log('got device', await device.getAddress(), await device.getName())
  await device.connect()
  console.log('connected')
  const gattServer = await device.gatt()

  // read write characteristic
  const service1 = await gattServer.getPrimaryService(TEST_SERVICE)
  const characteristic1 = await service1.getCharacteristic(TEST_CHARACTERISTIC)
  await characteristic1.writeValue(Buffer.from('Hello world'))
  const buffer = await characteristic1.readValue()
  console.log('read', buffer, buffer.toString())

  // subscribe characteristic
  const service2 = await gattServer.getPrimaryService(TEST_NOTIFY_SERVICE)
  const characteristic2 = await service2.getCharacteristic(TEST_NOTIFY_CHARACTERISTIC)
  await characteristic2.startNotifications()
  await new Promise(done => {
    characteristic2.once('valuechanged', buffer => {
      console.log('subscription', buffer)
      done()
    })
  })
  await characteristic2.stopNotifications()

  destroy()
}
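Assuming the TEST_* constants are filled in with your device's address and the relevant service/characteristic UUIDs, the function can then just be invoked from the top level:
// The TEST_* constants used above are placeholders you need to supply yourself.
main()
  .then(() => process.exit(0))
  .catch(err => {
    console.error(err)
    process.exit(1)
  })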
