Issues when testing Epic with TestScheduler - javascript

I'm using an rxjs epic as middleware for an async action in a react-redux app.
I'm trying to simulate an ajax request (through dependency injection) and test the epic's behavior based on the response.
This is my epic:
export const loginEpic = (action$, store$, { ajax }) => { // The ajax method is injected
  return action$.ofType(LoginActions.LOGIN_PENDING).pipe(
    mergeMap(action => {
      if (action.mail.length === 0) {
        return [ loginFailure(-1) ]; // This action is properly returned while testing
      } else {
        return ajax({ ... }).pipe(
          mergeMap(response => {
            if (response.code !== 0) {
              console.log(response.code); // This is logged
              return [ loginFailure(response.code) ]; // This action is expected
            } else {
              return [ loginSuccess() ];
            }
          }),
          catchError(() => {
            return [ loginFailure(-2) ];
          })
        );
      }
    })
  );
};
This part tests whether the mail address is empty, and it works just fine (or at least as expected):
it("empty mail address", () => {
  testScheduler.run(({ hot, expectObservable }) => {
    let action$ = new ActionsObservable(
      hot("a", {
        a: {
          type: LoginActions.LOGIN_PENDING,
          mail: ""
        }
      })
    );
    let output$ = loginEpic(action$, undefined, { ajax: () => ({}) });
    expectObservable(output$).toBe("a", {
      a: {
        type: LoginActions.LOGIN_FAILURE,
        code: -1
      }
    });
  });
});
However, this second test fails because the actual value is an empty array (no login failure action is returned):
it("wrong credentials", () => {
  testScheduler.run(({ hot, cold, expectObservable }) => {
    let action$ = new ActionsObservable(
      hot("a", {
        a: {
          type: LoginActions.LOGIN_PENDING,
          mail: "foo#bar.com"
        }
      })
    );
    let dependencies = {
      ajax: () =>
        from(
          new Promise(resolve => {
            let response = {
              code: -3
            };
            resolve(response);
          })
        )
    };
    let output$ = loginEpic(action$, undefined, dependencies);
    expectObservable(output$).toBe("a", {
      a: {
        type: LoginActions.LOGIN_FAILURE,
        code: -3
      }
    });
  });
});
Any idea what I'm doing wrong, and why this part returns an empty array (the console.log does actually log the code):
if (response.code !== 0) {
  console.log(response.code);
  return [ loginFailure(response.code) ];
}
While this part returns a populated array:
if (action.mail.length === 0) {
  return [ loginFailure(-1) ];
}

I'm guessing the use of a Promise causes the test to actually run asynchronously, while the TestScheduler flushes synchronously in virtual time. Try changing the ajax stub to use of(response) instead of from(new Promise(...)).
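A minimal, dependency-free sketch of why the Promise-based stub fails: promise callbacks run on the microtask queue, after the current synchronous code has finished, whereas of(response) emits synchronously during subscription.

```javascript
// Promise callbacks never fire during the synchronous code that created them,
// so a synchronous virtual-time flush (like TestScheduler's) sees no emission.
const emitted = [];
Promise.resolve({ code: -3 }).then(res => emitted.push(res));
// Still inside synchronous code: the .then callback has not run yet.
console.log(emitted.length); // 0
```

An of(response) stub avoids this entirely because its value is delivered synchronously when the epic subscribes.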

Related

How to test a custom yup validation method with Jest?

I am trying to create unit tests with Jest for a custom yup validation method, using both correct and incorrect data.
function nextValueBigger(message = 'Some error') {
  return this.test('nextValueBigger', (value, { path }) => {
    const errors = value.map((item, index) => {
      if (item?.from < item?.to) { return null; } // Successful validation, no error
      return new ValidationError(
        message,
        null,
        `${path}[${index}].to`,
      );
    }).filter(Boolean);
    if (errors.length === 0) { return true; }
    return new ValidationError(errors);
  });
}
I tried to simulate the validation like this:
const data = [
  { from: 1, to: 2 },
  { from: 3, to: 4 },
  { from: 5, to: 6 },
];
const schema = Yup.object().shape({
  items: Yup.array().of(Yup.object().shape({
    from: Yup.number(),
    to: Yup.number(),
  }))
  .nextValueBigger(),
});
it('should not find any errors', async () => {
  const result = await schema.validateAt('items', { items: data });
  expect(result).toEqual(true);
});
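Since the custom test reduces to a per-item comparison, it can help to check the predicate in isolation before involving yup at all. A plain-JS sketch (the helper name is made up, not part of yup) mirroring the map/filter logic from the custom method:

```javascript
// Reports the index of every item whose `from` is not smaller than its `to`,
// mirroring the map/filter in the custom yup test above.
const nextValueBiggerErrors = items =>
  items
    .map((item, index) => (item?.from < item?.to ? null : index))
    .filter(i => i !== null);

const data = [
  { from: 1, to: 2 },
  { from: 3, to: 4 },
  { from: 5, to: 6 },
];
console.log(nextValueBiggerErrors(data));                // [] since every `from` is below its `to`
console.log(nextValueBiggerErrors([{ from: 4, to: 3 }])); // [ 0 ]
```

If this predicate behaves as expected, the remaining test surface is only the wiring through Yup.addMethod and ValidationError.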

How to use a discriminated union properly for a function return type

I'm new to TypeScript. I was wondering how to utilize a discriminated union for a function's return type. I have an async function that calls 2 endpoints, and I want to correctly type the return value based on each API call's result.
So far I have managed to create these types:
interface GetDetailSuccess {
  status: "success";
  data: MovieDetailResult;
}
interface GetDetailFail {
  status: "failed";
  error: any;
}
interface GetCastSuccess {
  status: "success";
  data: MovieCastResult;
}
interface GetCastFail {
  status: "failed";
  error: any;
}
type MovieDetail = GetDetailSuccess | GetDetailFail;
type MovieCast = GetCastSuccess | GetCastFail;
type ReturnValue = {
  movieDetail: MovieDetail;
  movieCast: MovieCast;
};
Here is the simplified version of the function that I managed to create so far:
export const getMovieDetailAndCast = async (): Promise<ReturnValue> => {
  const movDet = {} as MovieDetail;
  const movCas = {} as MovieCast;
  await Promise.allSettled([
    api.getMovieDetail(),
    api.getMovieCast(),
  ])
    .then((responses) => {
      responses.forEach((res, index) => {
        if (index === 0) {
          if (res.status === "fulfilled") {
            movDet.status = "success";
            if (movDet.status === "success") {
              movDet.data = res.value.data;
            }
          }
          if (res.status === "rejected") {
            movDet.status = "failed";
            if (movDet.status === "failed") {
              movDet.error = res.reason.response.data.status_message;
            }
          }
        }
        if (index === 1) {
          if (res.status === "fulfilled") {
            movCas.status = "success";
            if (movCas.status === "success") {
              movCas.data = res.value.data;
            }
          }
          if (res.status === "rejected") {
            movCas.status = "failed";
            if (movCas.status === "failed") {
              movCas.error = res.reason.response.data.status_message;
            }
          }
        }
      });
    })
    .catch((err) => {
      console.error(err);
    });
  return {
    movieDetail: movDet,
    movieCast: movCas,
  };
};
So far the IDE doesn't yell at me about any errors, but I wonder whether what I am doing is correct, especially the type-narrowing part and the part where I assign empty objects using as. Is there anything I could do to improve the code above? Any feedback would be appreciated.
It's not clear what you want to achieve with different types for success and failure. Usually, the easiest way to tell success from failure is to check whether the error property is defined, or whether the status property is "success".
I think you should narrow your types like so to reduce complexity:
export const enum GetStatus {
  PENDING = "pending",
  SUCCESS = "success",
  FAILED = "failed"
}
export interface GetDetail {
  status: GetStatus;
  data: MovieDetailResult | undefined;
  error: any;
}
export interface GetCast {
  status: GetStatus;
  data: MovieCastResult | undefined;
  error: any;
}
export interface DetailAndCast {
  movieDetail: GetDetail;
  movieCast: GetCast;
}
Then, if you follow this suggestion you might want to keep track of your call with a "pending" status.
export const getMovieDetailAndCast: () => Promise<DetailAndCast> = async () => {
  const movieDetail: GetDetail = { status: GetStatus.PENDING, data: undefined, error: undefined };
  const movieCast: GetCast = { status: GetStatus.PENDING, data: undefined, error: undefined };
  await Promise.allSettled([ // allSettled, since we branch on "fulfilled"/"rejected" below
    api.getMovieDetail(),
    api.getMovieCast(),
  ])
    .then(responses => {
      responses.forEach((res, index) => {
        const returnValue = index === 0 ? movieDetail : movieCast;
        if (res.status === "fulfilled") {
          returnValue.status = GetStatus.SUCCESS;
          returnValue.data = res.value.data;
        } else if (res.status === "rejected") {
          returnValue.status = GetStatus.FAILED;
          returnValue.error = res.reason.response.data.status_message;
        }
      });
    })
    .catch(err => console.error(err));
  return { movieDetail, movieCast };
};
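Whichever typing you choose, consumers end up branching on the status tag at runtime before reading data or error, which is exactly what a discriminated union encodes at the type level. A minimal runtime sketch (sample values are made up):

```javascript
// Branch on the `status` tag before touching `data` or `error`. With the
// discriminated-union types above, TypeScript would narrow each branch so
// that only the properties valid for that tag are accessible.
function describeResult(result) {
  return result.status === "success"
    ? `got ${result.data}`
    : `failed: ${result.error}`;
}

console.log(describeResult({ status: "success", data: "movie" }));   // got movie
console.log(describeResult({ status: "failed", error: "timeout" })); // failed: timeout
```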

How can I make asynchronous queries in GraphQL?

I'm calling one query and one mutation. The mutation works fine, but when I get the response from my query I need to redirect the user to another page; in my case, the redirect runs before I get the response. How can I prevent this?
const renderData = async () => {
  const currentUserId = await data?.signInUserSession?.idToken
    ?.payload?.sub;
  const isAdmin = await data?.signInUserSession?.idToken?.payload[
    "custom:role"
  ];
  localStorage.setItem("userId", currentUserId);
  if (
    currentUserId !== null &&
    currentUserId !== undefined &&
    currentUserId !== ""
  ) {
    Auth.currentSession().then((data) => {
      setData({
        variables: {
          updateUserInput: {
            id: currentUserId,
            firstName: data.getIdToken().payload.given_name,
            lastName: data.getIdToken().payload.family_name,
          },
        },
      });
    });
    isCodeValid({
      variables: {
        validateUserVerificationCodeInput: {
          user: {
            id: currentUserId,
          },
        },
      },
    });
    if (isAdmin === "admin" && isUserCodeValid) {
      history.push("/managements");
    } else if (
      isUserCodeValid !== undefined &&
      isUserCodeValid === true
    ) {
      history.push("/verification");
    } else if (isUserCodeValid) {
      history.push("/stripe");
    }
  }
};
isUserCodeValid is the response from the query.
useMutation has onCompleted and refetchQueries options for such cases. It is hard to write an exact solution since not all of your code is visible, but an example like the one below should help, I believe:
const [addProduct, { data, loading, error }] = useMutation(
  createProductMutation
);
const onFinish = async (fieldNames) => {
  await addProduct({
    variables: { ...others, ...fieldNames },
    refetchQueries: [{ query: calledQuery }],
    onCompleted: (data) => {
      // your logic
    },
  });
  if (!error) {
    form.resetFields();
    onFinishSave(true);
  }
};
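The underlying fix is a timing one: make the redirect wait for the query's result instead of running immediately. A dependency-free sketch of that pattern, where checkCode and redirect are hypothetical stand-ins for the query call and history.push:

```javascript
// Gate the redirect on the awaited query result. With Apollo, the same effect
// is achieved by putting the redirect inside the query's onCompleted callback.
async function verifyAndRedirect(checkCode, redirect) {
  const isUserCodeValid = await checkCode(); // wait for the query response
  if (isUserCodeValid) redirect("/verification");
  return isUserCodeValid;
}

const visited = [];
verifyAndRedirect(() => Promise.resolve(true), path => visited.push(path))
  .then(ok => console.log(ok, visited)); // true [ '/verification' ]
```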

action creator does not return value to stream in marble test

I've got the following Epic, which works well in the application, but I can't get my marble test working. I am calling an action creator in map and it returns the correct object into the stream, but in the test I get an empty stream back.
export const updateRemoteFieldEpic = action$ =>
  action$.pipe(
    ofType(UPDATE_REMOTE_FIELD),
    filter(({ payload: { update = true } }) => update),
    mergeMap(({ payload }) => {
      const { orderId, fields } = payload;
      const requiredFieldIds = [4, 12]; // 4 = Name, 12 = Client-lookup
      const requestData = {
        id: orderId,
        customFields: fields
          .map(field => {
            return (!field.value && !requiredFieldIds.includes(field.id)) ||
              field.value
              ? field
              : null;
          })
          .filter(Boolean)
      };
      if (requestData.customFields.length > 0) {
        return from(axios.post(`/customfields/${orderId}`, requestData)).pipe(
          map(() => queueAlert("Draft Saved")),
          catchError(err => {
            const errorMessage =
              err.response &&
              err.response.data &&
              err.response.data.validationResult
                ? err.response.data.validationResult[0]
                : undefined;
            return of(queueAlert(errorMessage));
          })
        );
      }
      return of();
    })
  );
On a successful response from the server I call the queueAlert action creator.
export const queueAlert = (
  message,
  position = {
    vertical: "bottom",
    horizontal: "center"
  }
) => ({
  type: QUEUE_ALERT,
  payload: {
    key: uniqueId(),
    open: true,
    message,
    position
  }
});
And here is my test case:
describe("updateRemoteFieldEpic", () => {
  const sandbox = sinon.createSandbox();
  let scheduler;
  beforeEach(() => {
    scheduler = new TestScheduler((actual, expected) => {
      expect(actual).toEqual(expected);
    });
  });
  afterEach(() => {
    sandbox.restore();
  });
  it("should return success message", () => {
    scheduler.run(ts => {
      const inputM = "--a--";
      const outputM = "--b--";
      const values = {
        a: updateRemoteField({
          orderId: 1,
          fields: [{ value: "test string", id: 20 }],
          update: true
        }),
        b: queueAlert("Draft Saved")
      };
      const source = ActionsObservable.from(ts.cold(inputM, values));
      const actual = updateRemoteFieldEpic(source);
      const axiosStub = sandbox
        .stub(axios, "post")
        .returns([]);
      ts.expectObservable(actual).toBe(outputM, values);
      ts.flush();
      expect(axiosStub.called).toBe(true);
    });
  });
});
The output stream in actual returns an empty array.
I tried returning an observable of the action creator from map, which crashed the application because an action object was expected.
By stubbing axios.post(...) as [], you get from([]) in the epic: an empty observable that doesn't emit any values. That's why the map callback is never called. You can fix this by stubbing a single-element array instead, e.g. [null] or [{}].
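The array-to-observable semantics can be illustrated without RxJS at all: from(array) emits one value per element, much like forEach invokes its callback once per element, so an empty array produces zero emissions.

```javascript
// from([]) completes without emitting, the same way forEach on an empty array
// never runs its callback; a one-element array yields exactly one emission.
const calls = [];
[].forEach(v => calls.push(v));      // stub returned [] : zero emissions
[null].forEach(v => calls.push(v));  // stub returned [null] : one emission
console.log(calls.length); // 1
```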
The below is an answer to a previous version of the question. I kept it for reference, and because I think the content is useful for those who attempt to mock promise-returning functions in epic tests.
I think your problem is the from(axios.post(...)) in your epic. Axios returns a promise, and the RxJS TestScheduler has no way of making that synchronous, so expectObservable will not work as intended.
The way I usually address this is to create a simple wrapper module that does Promise-to-Observable conversion. In your case, it could look like this:
// api.js
import axios from 'axios';
import { from } from 'rxjs';

export function post(path, data) {
  return from(axios.post(path, data));
}
Once you have this wrapper, you can mock the function to return a constant Observable, taking promises completely out of the picture. If you do this with Jest, you can mock the module directly:
import * as api from '../api.js';
jest.mock('../api.js');
// In the test:
api.post.mockReturnValue(of(/* the response */));
Otherwise, you can also use redux-observable's dependency injection mechanism to inject the API module. Your epic would then receive it as third argument:
export const updateRemoteFieldEpic = (action$, state, { api }) =>
  action$.pipe(
    ofType(UPDATE_REMOTE_FIELD),
    filter(({ payload: { update = true } }) => update),
    mergeMap(({ payload }) => {
      // ...
      return api.post(...).pipe(...);
    })
  );
In your test, you would then just pass a mocked api object.
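The injection mechanism itself can be exercised without redux-observable: because the function receives its api as an argument, a test can hand it a recording fake instead of mocking the axios module. All names below are illustrative.

```javascript
// A factory that takes `api` as a dependency, plus a recording fake for tests.
const makeSaveField = ({ api }) => payload => api.post("/customfields", payload);

const calls = [];
const fakeApi = {
  post: (path, data) => { calls.push({ path, data }); return "ok"; }
};

const saveField = makeSaveField({ api: fakeApi });
console.log(saveField({ id: 1 })); // ok
console.log(calls.length);         // 1
```

The same shape scales up to the epic: pass { api: fakeApi } as the third argument and assert on the recorded calls.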

rxjs: subscribing late results in an empty stream

I have the following piece of code. As is, with a couple of lines commented out, it works as expected: I subscribe to a stream, do some processing, and stream the data to the client. However, if I uncomment those lines, my stream is always empty, i.e. count in getEntryQueryStream is always 0. I suspect I subscribe to the stream too late and thus miss all the values.
// a wrapper of the mongodb driver => returns rxjs streams
import * as imongo from 'imongo';
import * as Rx from 'rx';
import * as _ from 'lodash';
import {elasticClient} from '../helpers/elasticClient';
const {ObjectId} = imongo;

function searchElastic({query, sort}, limit) {
  const body = {
    size: 1,
    query,
    _source: { excludes: ['logbookType', 'editable', 'availabilityTag'] },
    sort
  };
  // keep the search results "scrollable" for 30 secs
  const scroll = '30s';
  let count = 0;
  return Rx.Observable
    .fromPromise(elasticClient.search({ index: 'data', body, scroll }))
    .concatMap(({_scroll_id, hits: {hits}}) => {
      const subject = new Rx.Subject();
      // subject needs to be subscribed to before adding new values
      // and therefore completing the stream => execute in next tick
      setImmediate(() => {
        if(hits.length) {
          // initial data
          subject.onNext(hits[0]._source);
          // code that breaks
          //if(limit && ++count === limit) {
          //  subject.onCompleted();
          //  return;
          //}
          const handleDoc = (err, res) => {
            if(err) {
              subject.onError(err);
              return;
            }
            const {_scroll_id, hits: {hits}} = res;
            if(!hits.length) {
              subject.onCompleted();
            } else {
              subject.onNext(hits[0]._source);
              // code that breaks
              //if(limit && ++count === limit) {
              //  subject.onCompleted();
              //  return;
              //}
              setImmediate(() =>
                elasticClient.scroll({scroll, scrollId: _scroll_id},
                  handleDoc));
            }
          };
          setImmediate(() =>
            elasticClient.scroll({scroll, scrollId: _scroll_id},
              handleDoc));
        } else {
          subject.onCompleted();
        }
      });
      return subject.asObservable();
    });
}
function getElasticQuery(searchString, filter) {
  const query = _.cloneDeep(filter);
  query.query.filtered.filter.bool.must.push({
    query: {
      query_string: {
        query: searchString
      }
    }
  });
  return _.extend({}, query);
}

function fetchAncestors(ancestorIds, ancestors, format) {
  return imongo.find('session', 'sparse_data', {
    query: { _id: { $in: ancestorIds.map(x => ObjectId(x)) } },
    fields: { name: 1, type: 1 }
  })
  .map(entry => {
    entry.id = entry._id.toString();
    delete entry._id;
    return entry;
  })
  // we don't care about the results
  // but have to wait for stream to finish
  .defaultIfEmpty()
  .last();
}
function getEntryQueryStream(entriesQuery, query, limit) {
  const {parentSearchFilter, filter, format} = query;
  return searchElastic(entriesQuery, limit)
    .concatMap(entry => {
      const ancestors = entry.ancestors || [];
      // if no parents => doesn't match
      if(!ancestors.length) {
        return Rx.Observable.empty();
      }
      const parentsQuery = getElasticQuery(parentSearchFilter, filter);
      parentsQuery.query.filtered.filter.bool.must.push({
        terms: {
          id: ancestors
        }
      });
      // fetch parent entries
      return searchElastic(parentsQuery)
        .count()
        .concatMap(count => {
          // no parents match query
          if(!count) {
            return Rx.Observable.empty();
          }
          // fetch all other ancestors that weren't part of the query results
          // and are still a string (id)
          const restAncestorsToFetch = ancestors.filter(x => _.isString(x));
          return fetchAncestors(restAncestorsToFetch, ancestors, format)
            .concatMap(() => Rx.Observable.just(entry));
        });
    });
}
function executeQuery(query, res) {
  try {
    const stream = getEntryQueryStream(query);
    // stream is passed on to another function here where we subscribe to it like:
    // stream
    //   .map(x => whatever(x))
    //   .subscribe(
    //     x => res.write(x),
    //     err => console.error(err),
    //     () => res.end());
  } catch(e) {
    logger.error(e);
    res.status(500).json(e);
  }
}
I don't understand why those few lines of code break everything or how I could fix it.
Your use case is quite complex; you can start by restructuring the searchElastic method along the pattern below:
- convert elasticClient.scroll to an observable first
- set up the initial data for elasticClient.search()
- when search resolves, you get your scroll id
- the expand() operator lets you recursively execute the elasticClientScroll observable
- use map to select the data you want to return
- use takeWhile to decide when to complete the stream
The result is that once you call searchElastic().subscribe(), the stream emits continuously until there's no more data to fetch.
Hope this structure is correct and can get you started.
function searchElastic({ query, sort }, limit) {
  const elasticClientScroll = Observable.fromCallback(elasticClient.scroll);
  let obj = {
    body: {
      size: 1,
      query,
      _source: { excludes: ['logbookType', 'editable', 'availabilityTag'] },
      sort
    },
    scroll: '30s'
  };
  return Observable.fromPromise(elasticClient.search({ index: 'data', body: obj.body, scroll: obj.scroll }))
    .expand(({ _scroll_id, hits: { hits } }) => {
      // guess there is more logic here .....
      // to update the scroll id or something
      return elasticClientScroll({ scroll: obj.scroll, scrollId: _scroll_id }).map(() =>
        //.. select the res you want to return
      );
    }).takeWhile(res => res.hits.length);
}
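The recursion that expand() plus takeWhile() express can be sketched without RxJS as a plain async loop: keep fetching pages until one comes back with no hits. Here fetchPage is a hypothetical stand-in for elasticClient.scroll, and the in-memory pages array is made-up test data.

```javascript
// Recursive "scroll until empty" pattern: follow the scroll id from each page
// until a page arrives with zero hits, then stop.
async function scrollAll(fetchPage) {
  const results = [];
  let page = await fetchPage(null); // initial search
  while (page.hits.length) {
    results.push(...page.hits);
    page = await fetchPage(page.scrollId); // follow the scroll id
  }
  return results;
}

// Tiny in-memory stub: two pages of data, then an empty page ends the loop.
const pages = [
  { scrollId: 1, hits: ["a"] },
  { scrollId: 2, hits: ["b"] },
  { scrollId: 3, hits: [] },
];
scrollAll(id => Promise.resolve(pages[id ?? 0]))
  .then(all => console.log(all)); // [ 'a', 'b' ]
```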
