I have a Sails app created with the --no-linker and --no-frontend options. The front end of the application is written in Angular 2. Making GET and POST requests seems to work fine.
When I send a GET request to the route (to subscribe), I don't get any updates when a model is created, updated, or destroyed.
I also created a custom action so I could handle things myself, but still had no luck.
The updates performed on the route are made over sockets. I don't know where I'm going wrong here. Find my code below:
import { Injectable, OnInit, EventEmitter } from '@angular/core';
import { Subject } from 'rxjs/Subject';
import { Donor } from './donor.interface';
import * as socketIO from 'socket.io-client';
import * as sailsIO from 'sails.io.js';

const url = 'http://localhost:1337';
const io = sailsIO(socketIO);
io.sails.reconnection = true;
io.sails.url = url;

io.socket.on('connect', function () {
  console.log("connected to server");

  // Subscribe via the blueprint route
  io.socket.get('/donor', function (data, jwres) {
    console.log("i subscribed", data, jwres);
  });

  // Subscribe via the custom action
  io.socket.get('/donor/hello', function (data, jwres) {
    console.log("i subscribed with hello", data, jwres);
  });

  // Listen for model events on the 'donor' event name
  io.socket.on('donor', function (data) {
    console.log("new donor was created", data);
  });
});

io.socket.on('disconnect', function () {
  console.log('Lost connection to server');
});
DonorController.js
module.exports = {
  hello: function (req, res) {
    if (req.isSocket) {
      // Add this socket to the model's room so it receives future publish events
      Donor.watch(req.socket);
      console.log("new subscriber found");
    } else {
      console.log("not a socket req");
    }
    return res.ok();
  }
};
So I figured out the problem. When you perform a CRUD operation over a socket, the socket that performed the operation does not receive the resulting notification.
I spent hours before figuring this out. So what I do now is act on the response data when the CRUD operation succeeds, the same data I would otherwise get by listening on the model with on.
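A minimal sketch of that workaround, assuming the default blueprint POST route and a hypothetical handleNewDonor helper:

// Sketch of the workaround (the blueprint POST route and handleNewDonor are assumptions for illustration)
io.socket.post('/donor', { name: 'Jane Doe' }, function (newDonor, jwres) {
  if (jwres.statusCode === 200) {
    // The requesting socket never gets the 'donor' event for its own create,
    // so act on the response body directly instead.
    handleNewDonor(newDonor);
  }
});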
Related
The socket has to listen to the server when I send a new message.
The messages do arrive, because when I refresh the page I can see them.
I know the server-side socket is working fine because the same client socket works fine in another front-end app.
This socket is set up every time the user selects a new chat in the web app.
Here is the socket service in my app:
import openSocket from "socket.io-client";
function connectToSocket() {
return openSocket("http://localhost:8080");
}
export default connectToSocket;
And here is the code executed when the user selects a chat:
// How I am importing the socket
import openSocket from "../../Services/socket-io";

async function fetchMessages(ticketId) {
  try {
    const { data } = await api.get("/messages/" + ticketId, {
      params: { pageNumber },
    });
    if (ticketId === data.ticket.id) {
      await loadMessages(data, ticketId);
    }
    listenMessages(ticketId);
  } catch (err) {
    Toast.ToastError("Error trying to load messages");
  }
}

function listenMessages(ticketId) {
  const socket = openSocket();
  socket.on("connect", () => socket.emit("joinChatBox", ticketId));
  socket.on("appMessage", (data) => {
    if (data.action === "create") {
      console.log(data);
    }
    if (data.action === "update") {
      console.log(data);
    }
  });
}
What I already tried:
Using the same socket.io version on both client and server (3.0.5).
Calling listenMessages every time a message is sent.
Upgrading socket.io-client to version 4.
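For context, the client above expects the server to join the socket to a per-ticket room and emit appMessage to it, along these lines (a simplified sketch, not the exact server code):

// Simplified sketch of the server side this client expects
const { Server } = require("socket.io");
const io = new Server(8080, { cors: { origin: "*" } });

io.on("connection", (socket) => {
  // The client emits "joinChatBox" with the ticket id right after connecting
  socket.on("joinChatBox", (ticketId) => socket.join(String(ticketId)));
});

// Called wherever a message is created or updated
function notifyTicket(ticketId, action, message) {
  io.to(String(ticketId)).emit("appMessage", { action, message });
}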
I have GraphQL subscriptions on my Apollo Server that I want to close after the user logs out. The initial question is whether we should close these (socket) connections on the client side or in the backend.
On the front end, I am using Angular with Apollo Client, and I handle GraphQL subscriptions by extending the Subscription class from apollo-angular. I am able to close the subscription channels with a typical RxJS takeUntil implementation:
this.userSubscription
  .subscribe()
  .pipe(takeUntil(this.subscriptionDestroyed$))
  .subscribe(
    ({ data }) => {
      // logic goes here
    },
    (error) => {
      // error handling
    }
  );
However, this does not close the WebSocket on the server, which, if I'm right, will result in a subscription memory leak.
The way the Apollo Server (and Express) is set up for subscriptions is as follows:
const server = new ApolloServer({
  typeDefs,
  resolvers,
  subscriptions: {
    onConnect: (connectionParams, webSocket, context) => {
      console.log('on connect');
      const payload = getAuthPayload(connectionParams.accessToken);
      if (payload instanceof Error) {
        webSocket.close();
      }
      return { user: payload };
    },
    onDisconnect: (webSocket, context) => {
      console.log('on Disconnect');
    }
  },
  context: ({ req, res, connection }) => {
    if (connection) {
      // set up context for subscriptions...
    } else {
      // set up context for Queries, Mutations...
    }
  }
});
When the client registers a new GraphQL subscription, I always see console.log('on connect'); in the server logs, but I never see console.log('on Disconnect'); unless I close the front-end application.
I haven't seen any example of how to close the WebSocket for subscriptions with Apollo. I mainly want to do this to complete a logout implementation.
Am I missing something here? Thanks in advance!
I based my solution on this post.
Essentially, the way we created the subscription over sockets was using subscriptions-transport-ws:
export const webSocketClient: SubscriptionClient = new SubscriptionClient(
  `${environment.WS_BASE_URL}/graphql`,
  {
    reconnect: true,
    lazy: true,
    inactivityTimeout: 3000,
    connectionParams: () => ({
      params: getParams()
    })
  }
);
As specified in the question, I wanted to unsubscribe from all channels and close the subscription socket connection before logging the user out. We do this by using the webSocketClient SubscriptionClient in the logout function and calling:
webSocketClient.unsubscribeAll();
webSocketClient.close();
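A minimal sketch of wiring this into a logout flow (the import path and the surrounding logout function are assumptions for illustration; only the two calls above come from the actual fix):

// Hypothetical logout helper; the import path is illustrative
import { webSocketClient } from "./websocket-client";

export function logout() {
  // Tear down all active subscription channels, then close the underlying WebSocket
  webSocketClient.unsubscribeAll();
  webSocketClient.close();
  // ...then clear tokens / navigate away as the rest of the logout flow requires
}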
Hello, I'm completely new to React/Redux, so there is a possibility that I violated some principles with the code below, so bear with me.
I'm building a React app which will consume my Express API. Everything is working perfectly, but when I was building the action creators I couldn't think of a good way to handle errors coming from the API without wrapping every single axios request in try/catch blocks.
Both in the PHP world, where I come from, and in Express you can create a global error handler.
For any async request handlers in my Express app, I wrap them with the function below so I can catch errors the same way as in synchronous code.
module.exports = (fn) => {
  return (req, res, next) => {
    fn(req, res, next).catch((err) => next(err));
  };
};
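For example, it lets an async route handler skip its own try/catch and forward any rejection to the global error handler (the route, model, and file paths below are hypothetical, just to show the usage):

// Hypothetical usage of the wrapper above; names and paths are illustrative
const express = require("express");
const catchAsync = require("./catchAsync"); // the module shown above

const router = express.Router();

router.get(
  "/users/:id",
  catchAsync(async (req, res) => {
    const user = await User.findById(req.params.id); // assumes a Mongoose-style model
    res.json({ data: { user } });
    // any rejection inside this handler is passed to next(err) by the wrapper
  })
);

module.exports = router;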
From what I've learned through googling, there is an ErrorBoundary HOC for handling errors inside components, and for axios calls I should use axios interceptors. So I created this:
AxiosFactory Class
import axios from "axios";
import { setError } from "../actions/utilActions";
import store from "../store";

class AxiosFactory {
  constructor(baseURL) {
    this.instance = axios.create({
      baseURL,
    });
    this.instance.interceptors.response.use(
      function (response) {
        // Any status code within the range of 2xx causes this function to trigger
        // Do something with the response data
        return response;
      },
      function (error) {
        // Any status code outside the range of 2xx causes this function to trigger
        // Do something with the response error
        // Getting the errors from the Express API
        const {
          response: {
            data: { errors },
          },
        } = error;
        store.dispatch(setError(errors));
        return Promise.reject(error);
      }
    );
  }
  getInstance() {
    return this.instance;
  }
}

export default AxiosFactory;
User API Caller
import AxiosFactory from './AxiosFactory';
const axios = new AxiosFactory('/api/v1/users/').getInstance();
export default axios;
User ActionCreator
import { SUCCESS_LOGIN } from "./types/userTypes";
import userApi from "../apis/user";

// Tries to log the user in
export const signInUser = () => {
  return async (dispatch) => {
    // Test
    const {
      data: {
        data: { user },
      },
    } = await userApi.post("login", {
      email: "test@test.com",
      password: "test12345!",
    });
    dispatch({
      type: SUCCESS_LOGIN,
      payload: user,
    });
  };
};
Error ActionCreator
import { HAS_ERROR } from "./types/utilTypes";

export const setError = (errors) => {
  return async (dispatch) => {
    dispatch({
      type: HAS_ERROR,
      payload: errors,
    });
  };
};
The interceptor successfully dispatches setError and the error state gets updated like a charm, which means I don't need to dispatch manually on each call. However, I still need to catch the Promise rejection coming from the interceptor.
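For clarity, this is the kind of per-call try/catch I'm trying to avoid repeating in every action creator (illustration only; the empty catch exists solely to swallow the rejection re-thrown by the interceptor):

// Illustration of the pattern I want to avoid, not code I intend to keep
import { SUCCESS_LOGIN } from "./types/userTypes";
import userApi from "../apis/user";

export const signInUserVerbose = () => {
  return async (dispatch) => {
    try {
      const {
        data: {
          data: { user },
        },
      } = await userApi.post("login", { email: "test@test.com", password: "test12345!" });
      dispatch({ type: SUCCESS_LOGIN, payload: user });
    } catch (err) {
      // setError was already dispatched by the interceptor;
      // this catch only silences the Promise.reject(error) it re-throws.
    }
  };
};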
My 2 questions are:
Is there a way to, let's say, "stop the dispatch from executing" inside my user action creator without try/catching the Promise?
Does this whole setup make sense, or is there a better way to do it?
So I have designed a basic publisher-subscriber model using rhea in JS that takes an API request for saving data in a DB and then publishes it to a queue.
From there a subscriber (code added below) picks it up and tries to save it in the DB. Now, my issue is that this DB instance goes through a lot of changes during the development period, which can result in errors during insert operations.
So when the subscriber tries to push to this DB and it results in an error, the data is lost since it was already dequeued. I'm a total novice in JS, so is there a way to make sure that a message isn't dequeued unless we are sure it was saved properly, without having to publish it again on error?
The code for my subscriber:
const Receiver = require("rhea");

const config = {
  port: 5672,
  host: "localhost"
};

let receiveClient;

function connectReceiver() {
  const receiverConnection = Receiver.connect(config);
  const receiver = receiverConnection.open_receiver("send_message");

  receiver.on("connection_open", function () {
    console.log("Subscriber connected through AMQP");
  });

  receiver.on("error", function (err) {
    console.log("Error with Subscriber:", err);
  });

  receiver.on("message", function (element) {
    if (element.message.body === "detach") {
      element.receiver.detach();
    } else if (element.message.body === "close") {
      element.receiver.close();
    } else {
      // save in DB
    }
  });

  receiveClient = receiver;
  return receiveClient;
}
You can use code like this to explicitly accept the message or release it back to the sender:
try {
  save_in_db(event.message);
  event.delivery.accept();
} catch {
  event.delivery.release();
}
See the delivery docs for more info.
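Note that rhea auto-accepts incoming messages by default, so you will likely also need to open the receiver with autoaccept disabled before managing deliveries yourself. A sketch adapted to the subscriber above (the autoaccept option reflects my reading of rhea's receiver options, and saveInDb is a hypothetical helper):

// Open the receiver with autoaccept off so we decide the outcome per message
const receiver = receiverConnection.open_receiver({
  source: "send_message",
  autoaccept: false
});

receiver.on("message", function (element) {
  try {
    saveInDb(element.message.body); // hypothetical synchronous DB save
    element.delivery.accept();      // only now is the message settled and removed
  } catch (err) {
    element.delivery.release();     // hand it back so it can be redelivered
  }
});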
In my frontend I'm using the sails.io.js wrapper. I'm trying to connect to my backend and listen for notifications. This works fine when I use blast to send to all connected sockets, but broadcast does nothing.
However, joining a room seems to work, as the callback of the join function gets executed without any errors.
The frontend part:
import * as socketIoClient from 'socket.io-client'
import * as sailsIo from 'sails.io.js'
const io = sailsIo(socketIoClient)
io.sails.url = 'localhost:1337'
io.sails.environment = process.env.NODE_ENV || 'development'
io.sails.useCORSRouteToGetCookie = false
io.socket.get('/notification/join', (data, jwRes) => {
console.log('Server responded with status code ' + jwRes.statusCode + ' and data: ', data);
})
io.socket.on('notification', data => {
console.log('Server says: ', data);
})
My Sails Controller:
export const join = async (req, res) => {
  if (!req.isSocket) {
    return res.badRequest();
  }

  // Join a user-specific notification room
  sails.sockets.join(req, 'notification', error => {
    if (!error) console.log('Everything went fine');
    return res.send('Connected'); // this works so far
  });

  // Send events each second
  setInterval(() => {
    sails.sockets.broadcast('notification', { data: 'Real notification' }); // This never works
    sails.sockets.blast('notification', { data: 'BLAST' }); // This works perfectly
  }, 1000);
};
Any suggestions on what is going wrong? Like I said, the callback of the join call executes without an error; the blast call also works fine and the frontend is able to receive that message.