How to create a plugin for Nuxt.js? - javascript

This is my rpc.js plugin file:
const { createBitcoinRpc } = require('@carnesen/bitcoin-rpc')
const protocol = 'http'
const rpcuser = 'root'
const rpcpassword = 'toor'
const host = '127.0.0.1'
const port = '43782'
const rpcHref = `${protocol}://${rpcuser}:${rpcpassword}@${host}:${port}/`
const bitcoinRpc = createBitcoinRpc(rpcHref)
export default ({ app }, inject) => {
  inject('bitcoinRpc', (method) =>
    bitcoinRpc(method).then((result) => console.log('That was easy!', result))
  )
}
This is my nuxt.config.js file:
...
plugins: [{ src: '@/plugins/gun.js' }, { src: '@/plugins/rpc.js' }],
...
If I call this.$bitcoinRpc('getnewaddress') somewhere in a component's methods, I get an error, but if I call the method inside the rpc plugin itself, everything works as expected:
// plugins/rpc.js:
// Declare constants and inject above
...
bitcoinRpc('getnewaddress').then((result) =>
  console.log('That was easy!', result)
)
I get the expected result in the terminal:
That was easy! 2N8LyZKaZn5womvLKZG2b5wGfXw8URSMptq 14:11:21
Can you explain what I'm doing wrong?
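For reference, the call from a component looks roughly like this (a minimal sketch; the component's method name is hypothetical):
// a hypothetical component method using the injected helper
export default {
  methods: {
    async generateAddress () {
      // $bitcoinRpc is the helper injected by plugins/rpc.js
      await this.$bitcoinRpc('getnewaddress')
    }
  }
}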

The approach I outlined was correct.
The error occurred because a server-side library cannot be used on the client side.
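One way to apply this, assuming Nuxt 2's plugin options, is to register the RPC plugin for the server build only in nuxt.config.js, so the Node-only @carnesen/bitcoin-rpc client is never bundled for the browser:
// nuxt.config.js (sketch): run plugins/rpc.js only during server-side rendering
plugins: [
  { src: '@/plugins/gun.js' },
  { src: '@/plugins/rpc.js', mode: 'server' },
],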

Related

Context is undefined in translation module

I tried to add a call to an endpoint in order to get translations. I have something like this:
const loadLocales = async () => {
  const context = require.context('./locales', true);
  const data = await ApiService.post(`${translationToolUrl}/gateway/translations`, { project: 'myProject' });
  const messages = context.keys()
    .map((key) => ({ key, locale: key.match(/[-a-z0-9_]+/i)[0] }))
    .reduce((msgs, { key, locale }) => ({
      ...msgs,
      [locale]: extendMessages(context(key)),
    }), {});
  return { context, messages };
};

const { context, messages } = loadLocales();

i18n = new VueI18n({
  locale: 'en',
  fallbackLocale: 'en',
  silentFallbackWarn: true,
  messages,
});

if (module.hot) {
  module.hot.accept(context.id, () => {
    const { messages: newMessages } = loadLocales();
    Object.keys(newMessages)
      .filter((locale) => messages[locale] !== extendMessages(newMessages[locale]))
      .forEach((locale) => {
        const msgs = extendMessages(newMessages[locale]);
        messages[locale] = msgs;
        i18n.setLocaleMessage(locale, msgs);
      });
  });
}
I added this request: ApiService.post. But I get the error TypeError: context is undefined, thrown at the line module.hot.accept(context.id.... Do you have an idea how I can solve that? My goal was to add this request in order to get translations from the database as well as from the .json files for now. I want to merge both for now; in the future I will get them only from the database, but this will be done step by step.
The problem is that you are trying to declare multiple const in the wrong way, independently of declaring them twice. This shows in:
const { context, messages } = loadLocales();
This would cause context and messages to be undefined. It won't give an error, as I replicated in a small example:
const { first, second } = 'Testing'
console.log(first)
console.log(second)
Both first and second will be undefined. If you want to declare multiple const at once, you need to do it this way:
const context = loadLocales(), messages = loadLocales();

Refactoring gatsby-node File into separate files not working

I'm trying to refactor my gatsby-node file by outsourcing a bit of code. Right now I'm trying to do this in my gatsby-node:
const createBlogPostPages = require("./gatsby-utils/createBlogPostPages");

exports.createPages = async ({ actions, graphql, reporter }) => {
  //...some code
  await createBlogPostPages({ actions, graphql, reporter });
  //...some code
}
and my createBlogPostPages, which is in a different file, looks like so:
const path = require("path");

module.exports = async function({ actions, graphql, reporter }) {
  const { createPage } = actions;
  const blogArticles = await graphql(`
    {
      allMdx(filter: { fileAbsolutePath: { regex: "/content/blog/.*/" } }) {
        edges {
          node {
            id
            fileAbsolutePath
            fields {
              slug
            }
            frontmatter {
              title
              tags
              date
              tagline
            }
          }
        }
      }
    }
  `);
  blogArticles.data.allMdx.edges.forEach(({ node }) => {
    let imageFileName = ... //some stuff
    createPage({
      path: `${node.fields.slug}`,
      component: path.resolve(`./src/templates/blog-post.js`),
      context: {
        slug: `${node.fields.slug}`,
        id: node.id,
        imageFileName: imageFileName
      }
    });
  });
};
All of this works when it's directly in gatsby-node.
However, having moved stuff, I now get:
"gatsby-node.js" threw an error while running the createPages
lifecycle:
blogArticles is not defined
ReferenceError: blogArticles is not defined
gatsby-node.js:177 Object.exports.createPages
/Users/kohlisch/blogproject/gatsby-node.js:177:19
next_tick.js:68 process._tickCallback
internal/process/next_tick.js:68:7
So it looks like it's not waiting for the graphql query to resolve? Or what might this be? I basically just want to move a few things out of my gatsby-node file into separate functions so that it's not so cluttered. Is this not possible?
There are two rules you need to follow when importing in gatsby-node.js:
1. Use Node.js require syntax.
./src/components/util/gatsby-node-functions
const importedFunction = () => {
  return Date.now();
};

module.exports.importedFunction = importedFunction;
gatsby-node.js
const { importedFunction } = require(`./src/components/util/gatsby-node-functions`);
// ...
// Use your imported functions
console.log(importedFunction());
Reference: Gatsby repo issue, which also includes a hack for using the ES6 import statement, if you want to add complexity just to use an import statement.
2. Do not pass gatsby-node.js specific attributes to your imported functions
If you attempt to outsource, for example, your createPages function, actions will be undefined:
const importedFunction = (actions, node) => {
  const { createPage } = actions; // actions is undefined
  createPage({
    path: `${node.fields.slug}`,
    component: path.resolve(`./src/templates/blog-post.js`),
    context: {
      slug: `${node.fields.slug}`,
      id: node.id,
    }
  });
};

module.exports.importedFunction = importedFunction;
Feel free to speculate why you cannot pass the attributes. The Gatsby documentation mentions "Redux" for handling state. Maybe Redux does not supply state outside your gatsby-node.js. Correct me if I'm wrong.

Dynamically export a module by reading all files in a directory in Node.js

So today I was trying to read all default exports
from a directory which has an index.js, wrap them inside one object, and export them back again. Is there a better way to handle this?
export default (() => require('fs')
  .readdirSync(__dirname)
  .filter(fileName => !!/.js$/ig.test(fileName))
  .map(fileName => fileName.split('.')[0])
  .reduce((defaultExportObj, nextFileName) => {
    try {
      return {
        ...defaultExportObj,
        [nextFileName]: require(__dirname + `/${nextFileName}`),
      };
    } catch (err) { throw err; }
  }, {}))();
I guess I'd do something like this - not sure if this is better - whatever better is ^^
webpack: require.context
function expDefault(path, mode = "sync") {
  const modules = {}
  const context = require.context(path, false, /\.js$/, mode)
  context.keys().forEach(file => {
    // keys look like "./name.js"; strip the path and extension to get the module name
    const name = file.replace(/^.+\/([^/]+)\.js$/, "$1")
    modules[name] = context(file).default
  })
  return modules
}
export default expDefault(__dirname)

How to make Dotenv and Nock work together?

Currently, in one of my apps, I am using nock to mock my API request. Unfortunately, in another test file of the same project, I used dotenv. If I use dotenv, my nock is not mocking the URL; it is using the original API request.
Any suggestions or help is appreciated.
My test file
'use strict';

const assert = require('assert');
const nock = require('nock');

describe('example', () => {
  afterEach(async () => {
    nock.cleanAll();
  });

  describe("checktest", () => {
    it("checksomeupdate", async () => {
      nock('http://example.com')
        .get('/demend-point')
        .reply(200, {
          x: 1
        })
      const result = await demoCallToMainFileMetho();
      const [a, b] = result || []; // response array [1,2,3]
      assert.ok(a.includes('1'));
    });
  });
});
My other file in the test dir:
require('dotenv').config();
.....some code
My issue is fixed.
Solution: I had to remove the dotenv package from my script. Wherever I needed it, I replaced it with:
process.env = Object.assign(process.env,
  { DEMO_VARIABLE_ONE: 'false' },
  { DEMO_VARIABLE_ONE_URL: 'value' },
  { DEMO_VARIABLE_TWO: 'value' }
);

How to get rid of the "use absolute URLs" error when testing async action creators in React?

I keep getting the error that I need to use absolute URLs with fetch when I am trying to mock the test call with fetchMock and nock.
describe('fetchFileNames creator #AT-fetchTodosIfNeeded#', () => {
  it('should create RECEIVE_FILE_NAMES_SUCCESS after the fetching is done', () => {
    const fileNames = ['testFile1', 'testFile2', 'testFile3'];
    const expectedActions = [
      { type: ac.REQUEST_FILE_NAMES },
      { type: ac.RECEIVE_FILE_NAMES_SUCCESS, fileNames }
    ];
    const store = mockStore({
      files: {
        fileNames
      }
    });
    fetchMock.get('*', { files: fileNames });
    return store.dispatch(at.fetchFileNames())
      .then(() => {
        var createdActions = store.getActions();
        delete createdActions[1].receivedAt;
        expect(store.getActions()).to.deep.equal(expectedActions);
      });
  });
});
The code turned out to be okay. I was importing isomorphic-fetch incorrectly into the async action creators file. What I was doing: import fetch from 'isomorphic-fetch'; what I should have been doing: import 'isomorphic-fetch'. It's documented in fetch-mock, but I missed it, and this was getting frustrating.
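For illustration, a minimal sketch of the two import forms at the top of the action creators file (the second imports the module only for its side effect of polyfilling the global fetch, which is the reference fetch-mock patches):
// what I was doing: binds a local fetch that bypasses global.fetch
// import fetch from 'isomorphic-fetch';

// what I should have been doing: polyfill global fetch so fetch-mock can intercept it
import 'isomorphic-fetch';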
