Refactoring gatsby-node file into separate files not working

I'm trying to refactor my gatsby-node file by moving some code out into separate files. Right now I'm trying to do this in gatsby-node:
const createBlogPostPages = require("./gatsby-utils/createBlogPostPages");

exports.createPages = async ({ actions, graphql, reporter }) => {
  // ...some code
  await createBlogPostPages({ actions, graphql, reporter });
  // ...some code
};
and my createBlogPostPages, which is in a different file, looks like so:
const path = require("path");

module.exports = async function ({ actions, graphql, reporter }) {
  const { createPage } = actions;

  const blogArticles = await graphql(`
    {
      allMdx(filter: { fileAbsolutePath: { regex: "/content/blog/.*/" } }) {
        edges {
          node {
            id
            fileAbsolutePath
            fields {
              slug
            }
            frontmatter {
              title
              tags
              date
              tagline
            }
          }
        }
      }
    }
  `);

  blogArticles.data.allMdx.edges.forEach(({ node }) => {
    let imageFileName = ... // some stuff
    createPage({
      path: `${node.fields.slug}`,
      component: path.resolve(`./src/templates/blog-post.js`),
      context: {
        slug: `${node.fields.slug}`,
        id: node.id,
        imageFileName: imageFileName
      }
    });
  });
};
All of this works when it's directly in gatsby-node.
However, having moved it out, I now get:
"gatsby-node.js" threw an error while running the createPages
lifecycle:
blogArticles is not defined
ReferenceError: blogArticles is not defined
gatsby-node.js:177 Object.exports.createPages
/Users/kohlisch/blogproject/gatsby-node.js:177:19
next_tick.js:68 process._tickCallback
internal/process/next_tick.js:68:7
So it looks like it's not waiting for the graphql query to resolve? Or what might this be? I basically just want to move a few things out of my gatsby-node file into separate functions so that it's not so cluttered. Is this not possible?

There are two rules you need to follow when importing in gatsby-node.js:
1. Use Node.js require syntax.
./src/components/util/gatsby-node-functions
const importedFunction = () => {
  return Date.now();
};

module.exports.importedFunction = importedFunction;
gatsby-node.js
const { importedFunction } = require(`./src/components/util/gatsby-node-functions`);
// ...
// Use your imported functions
console.log(importedFunction());
Reference: Gatsby repo issue, which also includes a hack for using the ES6 import statement, if you want to add complexity just for the sake of an import statement.
2. Do not pass gatsby-node.js specific attributes to your imported functions
If you attempt to outsource, for example, your createPages function, actions will be undefined:
const importedFunction = (actions, node) => {
  const { createPage } = actions; // actions is undefined
  createPage({
    path: `${node.fields.slug}`,
    component: path.resolve(`./src/templates/blog-post.js`),
    context: {
      slug: `${node.fields.slug}`,
      id: node.id,
    }
  });
};

module.exports.importedFunction = importedFunction;
Feel free to speculate about why you cannot pass these attributes. The Gatsby documentation mentions Redux for handling state; maybe Redux does not supply state outside your gatsby-node.js. Correct me if I'm wrong.

Related

Why are dynamic imports unexpectedly coupled between tests when using mock-fs?

I'm trying to use mock-fs to unit test code which uses ES6 dynamic imports.
There seems to be an unexpected coupling between tests when I'm using dynamic imports, even though I call restore() after each test. It appears as though fs.readFile() behaves as expected between tests (no coupling), but await import() has coupling (it returns the result from the previous test).
I've created a minimal Jest test case that reproduces the issue. The tests pass individually, but not when run together. I notice that if I change the directory value so it's different between each test, then they pass together.
Can you help me understand why this doesn't work, whether it's a bug, and what I should do here?
import path from 'path';
import { promises as fs } from 'fs';
import mockFs from 'mock-fs';

const fsMockModules = {
  node_modules: mockFs.load(path.resolve(__dirname, '../node_modules')),
};

describe('Reproduce dynamic import coupling between tests', () => {
  afterEach(() => {
    mockFs.restore();
  });

  it('first test', async () => {
    const directory = 'some/path';
    mockFs({
      ...fsMockModules,
      [directory]: {
        'index.js': ``,
      },
    });
    await import(path.resolve(`${directory}/index.js`));
    // not testing anything here, just illustrating the coupling for the next test
  });

  it('second test works in isolation but not together with first test', async () => {
    const directory = 'some/path';
    mockFs({
      ...fsMockModules,
      [directory]: {
        'index.js': `export {default as migrator} from './migrator.js';`,
        'migrator.js':
          'export default (payload) => ({...payload, xyz: 123});',
      },
    });
    const indexFile = await fs.readFile(`${directory}/index.js`, 'utf-8');
    expect(indexFile.includes('export {default as migrator}')).toBe(true);
    const migrations = await import(path.resolve(`${directory}/index.js`));
    expect(typeof migrations.migrator).toBe('function');
  });
});

Jest - mocking non-default imported function

I need to mock an imported function in a React application (created using create-react-app) with Jest. I tried to do what is shown in the docs and in other threads (e.g. here: Jest – How to mock non-default exports from modules?), but it doesn't work. Currently my project looks like this:
calls getOne.js
import { getOne } from "../../Services/articleService";

export const a = async function () {
  return getOne("article1");
};
articleService.js
export const getOne = async function (id) {
  return;
};
testss.test.js
import { ArticleTypes } from "../../Data/articleTypes";
import { getOne } from "../../Services/articleService";
import { a } from "./calls getOne";

jest.mock("../../Services/articleService", () => {
  return {
    getOne: jest.fn().mockImplementation(async (id) => {
      const articles = {
        article1: {
          title: "Some valid title",
          description: "Some valid description",
          type: /* ArticleTypes.General */ "a"
        }
      };
      return Promise.resolve(articles[id]);
    })
  };
});

const mockedGetOne = async (id) => {
  const articles = {
    article1: {
      title: "Some valid title",
      description: "Some valid description",
      type: ArticleTypes.General
    }
  };
  return Promise.resolve(articles[id]);
};

beforeAll(() => {
  getOne.mockImplementation(mockedGetOne);
});

test("calls getOne", async () => {
  const res = await a();
  expect(getOne).toHaveBeenCalled();
  expect(res).not.toBeUndefined();
});
I have currently commented out ArticleTypes in the factory function used in jest.mock. I need it, but I cannot use imported modules inside the factory function. I just wanted to test whether mocking would work there, but it does not work anywhere.
Why can't I mock the imports? Am I missing something? Do I need some additional configuration I don't know about?
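One detail that commonly bites here: Jest hoists jest.mock calls above the import statements, so the factory runs before ArticleTypes exists, and Jest deliberately blocks the factory from referencing out-of-scope variables unless their names start with "mock". A sketch under that assumption (the mockArticles name is illustrative):

```javascript
// Sketch (assumption): jest.mock is hoisted above imports, but the factory
// may reference out-of-scope variables whose names begin with "mock".
const mockArticles = {
  article1: {
    title: "Some valid title",
    description: "Some valid description",
    type: "a" // imported enums like ArticleTypes still can't be used here
  }
};

jest.mock("../../Services/articleService", () => ({
  getOne: jest.fn().mockImplementation(async (id) => mockArticles[id])
}));
```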

How to create plugin for Nuxt.js?

This is my rpc.js plugin file:
const { createBitcoinRpc } = require('@carnesen/bitcoin-rpc')
const protocol = 'http'
const rpcuser = 'root'
const rpcpassword = 'toor'
const host = '127.0.0.1'
const port = '43782'
const rpcHref = `${protocol}://${rpcuser}:${rpcpassword}@${host}:${port}/`
const bitcoinRpc = createBitcoinRpc(rpcHref)
export default ({ app }, inject) => {
  inject('bitcoinRpc', (method) =>
    bitcoinRpc(method).then((result) => console.log('That was easy!', result))
  )
}
This is my nuxt.config.js file:
...
plugins: [{ src: '@/plugins/gun.js' }, { src: '@/plugins/rpc.js' }],
...
If I call this.$bitcoinRpc('getnewaddress') somewhere in the component methods, then I get an error, but if I call this method inside the rpc plugin itself, then everything works as expected:
// plugins/rpc.js:
// Declare constants and inject above
...
bitcoinRpc('getnewaddress').then((result) =>
  console.log('That was easy!', result)
)
I get the expected result in the terminal:
That was easy! 2N8LyZKaZn5womvLKZG2b5wGfXw8URSMptq 14:11:21
Can someone explain what I'm doing wrong?
The method I outlined was correct.
The error occurred because server-side libraries cannot be used on the client side.
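One way to enforce that split (an assumption about the setup, since the answer doesn't show config): Nuxt 2 lets you confine a plugin to one side with the mode option, so a library that only works in Node is never bundled for the browser.

```javascript
// nuxt.config.js -- sketch: restrict where each plugin runs.
export default {
  plugins: [
    { src: '@/plugins/gun.js' },
    { src: '@/plugins/rpc.js', mode: 'server' }, // 'client' for browser-only code
  ],
}
```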

info argument is empty in Apollo GraphQL resolver type signature

I'm working on the library Wertik JS (https://github.com/ilyaskarim/wertik-js), which aims to make building GraphQL + REST APIs easier. In the resolvers, when I console.log the info argument, it shows undefined. For each module, I have created dynamic resolvers to make things easier for developers who will use this library.
let object = {
  create: async (_: any, args: any, context: any, info: any) => {
    console.log(info); // This will be undefined
    let v = await validate(validations.create, args.input);
    let { success } = v;
    if (!success) {
      throw new ApolloError("Validation error", statusCodes.BAD_REQUEST.number, { list: v.errors });
    }
    try {
      let createModel = await model.create(args.input);
      pubsub.publish(`${camelCase(moduleName)}Created`, { [`${camelCase(moduleName)}Created`]: createModel });
      return createModel;
    } catch (e) {
      return internalServerError(e);
    }
  },
}
Line: https://github.com/ilyaskarim/wertik-js/blob/ec813f49a14ddd6a04680b261ae4ef2aadc2b1a5/src/framework/dynamic/resolvers.ts#L102
The info argument is described in the Apollo Server documentation (https://www.apollographql.com/docs/apollo-server/essentials/data/#resolver-type-signature), which says: "This argument contains information about the execution state of the query, including the field name, the path to the field from the root, and more." For me, unfortunately, it is undefined.
To reproduce the issue:
Download https://github.com/ilyaskarim/wertik-js/tree/development
Run yarn install
Go to examples/demo
Run node index.js
Now go to http://localhost:1209/
Enter this mutation for example:
mutation {
  createRole(input: { name: "Asd" }) {
    name
  }
}
This line executes on this mutation https://github.com/ilyaskarim/wertik-js/blob/ec813f49a14ddd6a04680b261ae4ef2aadc2b1a5/src/framework/dynamic/resolvers.ts#L102
And returns undefined on the console.
This is how I set up the application:
const { ApolloServer } = require('apollo-server');
import mutations from "./loadAllMutations";
import queries from "./loadAllQueries";
import resolvers from "./loadAllResolvers";
import subscriptions from "./loadAllSubscriptions";
import schemas from "./loadAllSchemas";
import generalSchema from "./../helpers/generalSchema";

export default function (rootDirectory: string, app: any, configuration: object) {
  let allMutations = mutations(rootDirectory);
  let allQueries = queries(rootDirectory);
  let allSchemas = schemas(rootDirectory);
  let allResolvers = resolvers(rootDirectory);
  let allSubscriptions = subscriptions(rootDirectory);
  let { validateAccessToken } = require(`${rootDirectory}/framework/predefinedModules/user/auth`).default;
  let mainSchema = `
    ${generalSchema}
    ${allSchemas}
    type Subscription {
      ${allSubscriptions}
    }
    type Mutation {
      ${allMutations}
    }
    type Query {
      ${allQueries}
    }
    schema {
      query: Query
      mutation: Mutation
      subscription: Subscription
    }
  `;
  const server = new ApolloServer({
    typeDefs: mainSchema,
    resolvers: allResolvers,
    context: async (a: any) => {
      await validateAccessToken(a.req);
    }
  });
  server.listen(1209).then(({ url, subscriptionsUrl }) => {
    console.log(`Server ready at ${url}`);
    console.log(`Subscriptions ready at ${subscriptionsUrl}`);
  });
}
What could be a possible reason?
You're truncating the parameters received by the resolvers inside this module. If you need to assign a function to some object property, it's much better to just do it like this:
mutations: {
  [`create${moduleName}`]: mutations[`create${moduleName}`],
},
This is not only more succinct, but it also means you don't risk accidentally leaving off a parameter, which is what happened here.
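The effect is easy to reproduce with plain functions, independent of GraphQL (a minimal sketch; resolver here is a stand-in, not the library's code):

```javascript
// Sketch: why re-wrapping a resolver can lose the `info` argument.
const resolver = (root, args, context, info) => info;

// Truncating wrapper: only forwards three of the four parameters,
// so `info` arrives as undefined inside `resolver`.
const truncated = (root, args, context) => resolver(root, args, context);

// Direct reference: whatever GraphQL passes reaches the resolver intact.
const direct = resolver;

console.log(truncated(1, 2, 3, 4)); // undefined
console.log(direct(1, 2, 3, 4));    // 4
```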

How to get rid of the "use absolute URLs" error when testing async action creators in React?

I keep getting an error from fetch saying that I need to use absolute URLs when I try to mock the call under test with fetch-mock and nock.
describe('fetchFileNames creator #AT-fetchTodosIfNeeded#', () => {
  it('should create RECEIVE_FILE_NAMES_SUCCESS after the fetching is done', () => {
    const fileNames = ['testFile1', 'testFile2', 'testFile3'];
    const expectedActions = [
      { type: ac.REQUEST_FILE_NAMES },
      { type: ac.RECEIVE_FILE_NAMES_SUCCESS, fileNames }
    ];
    const store = mockStore({
      files: {
        fileNames
      }
    });
    fetchMock.get('*', { files: fileNames });
    return store.dispatch(at.fetchFileNames())
      .then(() => {
        var createdActions = store.getActions();
        delete createdActions[1].receivedAt;
        expect(store.getActions()).to.deep.equal(expectedActions);
      });
  });
});
The code turned out to be okay. I was importing isomorphic-fetch incorrectly into the async action creators file. What I was doing: import fetch from 'isomorphic-fetch'. What I should have been doing: import 'isomorphic-fetch' (a side-effect-only import, so the code under test uses the same global fetch that the mock patches). It's documented in fetch-mock, but I missed it, and this was getting frustrating.
