Engine not found for the ".js" file extension - javascript

I want to use koa-views with Koa and koa-router alongside Next.js. In previous projects I had no issues with Express, but in this project I have to use Koa. Using its router, I want to render a page at /some/page/:id, following the usual Next.js approach:
router.get('/some/page/:id', async (ctx, next) => {
const actualPage = '/some/page/id' // id.js (not actual name 😝)
await ctx.render(actualPage, {/* could pass object */})
});
That would work if I was using express. With Koa:
const Koa = require('koa');
const views = require('koa-views');
// const render = require('koa-views-render'); <-- what's this?
[..] // Making things short here
const server = new Koa();
const router = new Router();
// My issue, I'm seeing tutorials using other engines: .ejs etc
// I'm not using any, I only have .js files
server.use(views(__dirname + "/pages", { extension: 'js' }));
Using the same router.get... function as above, I get:
Error: Engine not found for the ".js" file extension
When I go to /some/page/123, I'd expect it to render the file /pages/some/page/id.js. How?

It turns out I do not need any extra modules to achieve this 🙀
Create a function called, for example, routes, and pass router and app as parameters:
const routes = (router, app) => {
  router.get('/some/page/:id', async (ctx) => {
    const { id } = ctx.params
    const actualPage = '/some/page/id'
    // Render the page (the object is optional data passed to the page)
    await app.render(ctx.req, ctx.res, actualPage, { foo: 'Bar' })
  })
}
module.exports = routes
Inside your server.js file:
// const routes = require('./routes');
// const app = next({ dev }); // import other modules for this section
// app.prepare().then(() => {
//   const router = new Router();
//   [..]
//   routes(router, app)
// })
The commented-out section is a slimmed-down version to show where things should go.
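For completeness, here is a fuller sketch of server.js under the same assumptions (a custom Next.js server with koa-router; the port and the dev flag are illustrative, not from the original answer):
const Koa = require('koa');
const Router = require('koa-router');
const next = require('next');
const routes = require('./routes');

const dev = process.env.NODE_ENV !== 'production';
const app = next({ dev });
const handle = app.getRequestHandler();

app.prepare().then(() => {
  const server = new Koa();
  const router = new Router();

  // Register the custom page routes defined in routes.js
  routes(router, app);

  // Hand everything else (static assets, /_next, unmatched pages) to Next.js
  router.get('(.*)', async (ctx) => {
    await handle(ctx.req, ctx.res);
    ctx.respond = false; // Next.js has already handled the response
  });

  server.use(router.routes());
  server.use(router.allowedMethods());
  server.listen(3000);
});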

Related

How to pass koa-session data to page components in NextJS?

In my NextJS project, I created a custom server with koa + koa-session, so that I can have some session data for each request, like the code below:
import next from "next";
import Koa from "koa";
import Session from "koa-session";
import Router from "koa-router";
...
const next_app = next({...});
const handle = next_app.getRequestHandler();
next_app.prepare().then(async () => {
const server = new Koa();
const router = new Router();
server.use(Session(server)); // use koa-session middleware
...
router.get("(.*)", async (ctx) => {
console.log("server.js session: ", ctx.session);
...
// create or update session data
ctx.session.custom_data += 123;
// but how to pass ctx.session to handle(), then to page components?
await handle(ctx.req, ctx.res);
});
As the comment says, how do I pass ctx.session data to page components?
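No answer is included here, but a common approach (a sketch based on an assumption, not taken from this thread) is to attach the session to the Node request object before calling handle(), so pages can read it server-side during the initial render:
router.get("(.*)", async (ctx) => {
  ctx.session.custom_data = (ctx.session.custom_data || 0) + 123;
  // Expose the Koa session on the raw Node request that Next.js receives
  ctx.req.session = ctx.session;
  await handle(ctx.req, ctx.res);
  ctx.respond = false;
});

// In a page component, req is only defined during server-side rendering:
// SomePage.getInitialProps = ({ req }) =>
//   ({ custom_data: req ? req.session.custom_data : null });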

How Can I Serve Static Content Alongside Dynamic Routes in A Deno Oak Server

I am used to working with NodeJS and Koa. I've been playing with Deno and have run the example of a static fileserver:
/* static_server.js */
import { Application } from 'https://deno.land/x/oak/mod.ts'
const port = 8080
const app = new Application()
// Error handler middleware
app.use(async (context, next) => {
try {
await next()
} catch (err) {
console.error(err)
}
})
// Send static content
app.use(async (context) => {
console.log(`${context.request.method} ${context.request.url.pathname}`)
await context.send({
root: `${Deno.cwd()}/static`,
index: "index.html",
})
})
await app.listen({ port })
I have also created a dynamic server using routes:
/* routes.js */
import { Application, Router } from 'https://deno.land/x/oak/mod.ts'
const port = 8080
const app = new Application()
const router = new Router()
router.get('/', context => {
context.response.body = 'Hello world!'
})
router.get('/foo', context => {
context.response.body = 'Book Page'
})
router.get('/foo/:thing', context => {
context.response.body = `Foo ${context.params.thing}`
})
app.use(router.routes())
app.use(router.allowedMethods())
await app.listen({ port })
How can I combine these so that I can serve dynamic content but also provide static files such as the stylesheet?
In my Koa code I use the koa-static package:
import serve from 'koa-static'
app.use(serve('public'))
What is the equivalent for an Oak server?
Adding suggested code (thanks Jonas Wilms)
/* static_content.js */
import { Application, Router } from 'https://deno.land/x/oak/mod.ts'
const port = 8080
const app = new Application()
const router = new Router()
router.get('/', context => {
context.response.body = 'Hello world!'
})
router.get('/foo', context => {
context.response.body = 'Book Page'
})
router.get('/foo/:thing', context => {
context.response.body = `Foo ${context.params.thing}`
})
router.get(context => context.send({ root: `${Deno.cwd()}/static` }))
app.use(router.routes())
app.use(router.allowedMethods())
await app.listen({ port })
but this still does not work...
After combining a lot of the information in the comments I managed to get things working:
/* static_content.js */
import { Application, Router, Status } from 'https://deno.land/x/oak/mod.ts'
const port = 8080
const app = new Application()
const router = new Router()
// error handler
app.use(async (context, next) => {
try {
await next()
} catch (err) {
console.log(err)
}
})
// the routes defined here
router.get('/', context => {
context.response.body = 'Hello world!'
})
router.get('/error', context => {
throw new Error('an error has been thrown')
})
app.use(router.routes())
app.use(router.allowedMethods())
// static content
app.use(async (context, next) => {
const root = `${Deno.cwd()}/static`
try {
await context.send({ root })
} catch {
next()
}
})
// page not found
app.use( async context => {
context.response.status = Status.NotFound
context.response.body = `"${context.request.url}" not found`
})
app.addEventListener("listen", ({ port }) => console.log(`listening on port: ${port}`) )
await app.listen({ port })
I know I'm a bit late on the thread, but there are some things I would like to point out.
In Oak 10.1 (the current version at the time of this writing), the send function throws an error if the file it tried to load does not exist. Thus, our static+dynamic server can take the following form.
import { oak, pathUtils } from './deps.ts'
const app = new oak.Application()
const router = new oak.Router()
app.use(async (ctx, next) => {
try {
await oak.send(ctx, ctx.request.url.pathname, {
root: 'static',
index: 'index.html',
})
} catch (_) {
await next()
}
})
router.get('/dynamic', ctx => {
ctx.response.body = 'dynamic route worked'
})
app.use(router.allowedMethods())
app.use(router.routes())
app.listen({ port: 8000 })
If you want to serve your static files at a certain root path, change the static middleware so that it checks for the root and then omits that root path from the second argument of the send function.
function staticMiddleware(rootPath: string, staticDirectory: string) {
return async (ctx, next) => {
if (!ctx.request.url.pathname.startsWith(rootPath)) return await next()
const newPath = ctx.request.url.pathname.slice(rootPath.length)
if (!newPath.startsWith('/') && newPath.length) return await next()
try {
await oak.send(ctx, newPath, {
root: staticDirectory,
index: 'index.html',
})
} catch (_) {
await next()
}
}
}
app.use(staticMiddleware('/assets', 'static'))
I think you should register the static handler last. If the static handler runs first, an error it throws can make the dynamic routes unreachable.
app.use(router.routes())
app.use(router.allowedMethods())
// move the static router down
app.use( async context => {
context.response.status = Status.NotFound
context.response.body = `"${context.request.url}" not found`
})
Not sure whether this is still relevant or already outdated, but as of now (August 2022), there seems to be no general answer to this.
Serving Static Files Alongside Your API Using Oak/Deno
When setting up OpenAPI for an Oak-based REST service, I came across this issue as well. The requirements were:
Serve openapi.yml statically from /openapi/openapi.yml
Serve an HTML page statically from /openapi for convenience
Leave prefixed routers unaffected
A straightforward approach to serving static files from a certain directory under a sub-path of the application is to use a middleware and check the path:
import {
Application, Context, Router
} from 'https://deno.land/x/oak@v11.1.0/mod.ts';
const app = new Application();
const port = 3000;
// Middleware only hooking in and sending static files if prefix matches
// the desired subpath:
const openapi = async (ctx: Context, next: () => Promise<unknown>) => {
const prefix = '/openapi'; // Sub-path to react on
if (ctx.request.url.pathname.startsWith(prefix)) {
await ctx.send({
root: `${Deno.cwd()}/src/openapi/`, // Local directory to serve from
index: 'index.html',
path: ctx.request.url.pathname.replace(prefix, ''), // Map to target path
});
} else {
// If the request does not match the prefix, just continue serving from
// whatever comes next..
await next();
}
};
// This is a dummy API endpoint wrapped into a prefixed router for demo:
const statusRouter = new Router({ prefix: '/status' });
statusRouter.get('/', (ctx: Context) => {
ctx.response.body = {
healthy: true,
ready: true,
};
});
// Boilerplate..
app.use(openapi);
app.use(statusRouter.routes());
app.use(statusRouter.allowedMethods());
app.addEventListener('listen', () => {
console.log(`Listening on localhost:${port}`);
});
await app.listen({ port });
Running this MWE using deno run --allow-net --allow-env --allow-read src/serve.ts, you'll find the statically served /openapi/openapi.yml, find the index.html from your local static path served under /openapi (resp. /openapi/ and /openapi/index.html), and find the /status API behaving just normally.
I'm using it like this. In the HTML you can provide a path to your file:
<script src="/js/app.js"></script>
then you can use a route to serve whatever you want at the path /js/app.js:
import { Router, RouterContext } from 'https://deno.land/x/oak/mod.ts'

const router = new Router()
const decoder = new TextDecoder('utf-8') // set up the decoder
const fileCont = await Deno.readFile('./views/test.js') // read the file contents
const fileJS = decoder.decode(fileCont) // decode them to a string

// the route path can differ from the real file location
router.get('/js/app.js', (ctx: RouterContext) => {
  ctx.response.type = "application/javascript"
  ctx.response.body = fileJS
})
and wherever you reference that link, it will serve your file.

ReferenceError when using MongoDB Collection variable in external resolver file that was imported via mergeResolvers

This is a totally reduced example to better explain the issue. When I use the getAllUsers query resolver, the MongoDB collection Users is not available in the external resolver file user.js, so when I send that query I get:
ReferenceError: Users is not defined
That's expected behaviour, but I do not want to include all the resolvers in my index.js, because keeping them in separate files gives me better modularization. So I have all my typedefs and resolvers in external files like this.
Current file structure
index.js
/graphql
  /typedef
    user.graphql
  /resolver
    user.js
The user.graphql schema works correctly. It is just user.js that produces the error when I execute the query, because the Users variable is not available there, as already said.
Here the index.js and user.js.
index.js
import express from 'express'
import cors from 'cors'
const app = express()
app.use(cors())
import bodyParser from 'body-parser'
import {graphqlExpress, graphiqlExpress} from 'graphql-server-express'
import {makeExecutableSchema} from 'graphql-tools'
import {fileLoader, mergeTypes, mergeResolvers} from 'merge-graphql-schemas';
import {writeFileSync} from 'fs'
const typeDefs = mergeTypes(fileLoader(`${__dirname}/graphql/typedef/*.graphql`), { all: true })
writeFileSync(`${__dirname}/graphql/typedef.graphql`, typeDefs)
export const start = async () => {
try {
const MONGO_URL = 'mongodb://localhost:27017'
const MongoClient = require('mongodb').MongoClient;
MongoClient.connect(MONGO_URL, function(err, client) {
console.log("Connected successfully to server");
const db = client.db('project');
const Users = db.collection('user')
});
const URL = 'http://localhost'
const homePath = '/graphql'
const PORT = 3001
app.use(
homePath,
bodyParser.json(),
graphqlExpress({schema})
)
app.use(homePath,
graphiqlExpress({
endpointURL: homePath
})
)
app.listen(PORT, () => {
console.log(`Visit ${URL}:${PORT}${homePath}`)
})
} catch (e) {
console.log(e)
}
}
user.js
export default {
Query: {
getAllUsers: async () => {
return (await Users.find({}).toArray()).map(prepare)
}
}
}
What is the best way to pass the MongoDB or the Users collection to the resolver files. Or is there an even better solution for this issue?
First of all, this is NOT a proper solution, because declaring global variables while splitting the schema into separate files is bad design. But it works, and maybe it gives someone an idea of how to improve this fix.
So to solve the issue, all I had to do was change the variable from a local const to a global.
In index.js, const Users = db.collection('user') is replaced by global.Users = db.collection('user').
Same for user.js: return (await Users.find({}).toArray()).map(prepare) is replaced by return (await global.Users.find({}).toArray()).map(prepare).
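A cleaner alternative (a sketch of the commonly recommended pattern, not part of the original answer) is to hand the collection to the resolvers through the GraphQL context instead of a global. This assumes the schema is built from the merged typedefs and resolvers, and that the graphqlExpress app.use is moved inside the MongoClient.connect callback so Users is in scope:
// index.js (inside the MongoClient.connect callback)
const db = client.db('project');
const Users = db.collection('user');

app.use(
  homePath,
  bodyParser.json(),
  graphqlExpress({
    schema,
    context: { Users } // every resolver receives this as its third argument
  })
);

// graphql/resolver/user.js
export default {
  Query: {
    getAllUsers: async (root, args, { Users }) => {
      return (await Users.find({}).toArray()).map(prepare);
    }
  }
};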

How can I pass options into an imported module?

I have a utility module that creates an instance of a multer-gridfs storage engine for uploading files to my Mongo database. I use this module inside of any API route that requires the need to upload files.
I need to be able to update the metadata property value with a unique identifier. More than likely this will be the mongoose _id of the user uploading the file, but for now I am not concerned with that aspect of it. I really just want to know if I can change the metadata property dynamically.
Here is the storage engine gridFs_upload_engine.js:
const mongoose = require('mongoose');
const path = require('path');
const crypto = require('crypto');
const multer = require('multer');
const GridFsStorage = require('multer-gridfs-storage');
const Grid = require('gridfs-stream');
//Init Upload Engine
let gfs;
//Global instance of the DB connection
const database = mongoose.connection;
const mongoDb = process.env.MONGODB_URI || process.env.MLAB_URL;
database.once('open', () => {
//Init Stream
gfs = Grid(database.db, mongoose.mongo);
gfs.collection('uploads');
});
//Create Storage Engine
const storage = new GridFsStorage({
url: mongoDb,
file: (res, file) => {
return new Promise((resolve, reject) => {
crypto.randomBytes(16, (err, buf) => {
if (err) {
return reject(err);
}
const filename = buf.toString('hex') + path.extname(file.originalname);
const fileInfo = {
filename: filename,
bucketName: 'uploads',
metadata: 'NEED TO UPDATE THIS'
};
resolve(fileInfo);
});
});
}
});
const uploadEngine = multer({ storage });
module.exports = {
uploadEngine,
gfs
};
Above you can see the metadata property that I need to be able to dynamically change with some undetermined unique identifier. Is it possible to do that with an exported file?
Here is how I am utilizing it inside of an API route:
const express = require('express');
const router = express.Router();
//Controllers
const upload_controller = require('../../controllers/uploader');
//Utilities
const upload = require('../../utils/gridFs_upload_engine');
const { uploadEngine } = upload;
//Upload Single File
router.post(
'/single',
uploadEngine.single('file'),
upload_controller.upload_single_file
);
//Upload Multiple Files
//Max file uploads at once set to 30
router.post(
'/multiple',
uploadEngine.array('file', 30),
upload_controller.upload_multiple_files
);
module.exports = router;
I pass the uploadEngine into the API route here, so that the route controller can use it, and that works with no issue. I am just having quite a time trying to figure out how to update metadata dynamically, and I am leaning towards my current implementation not allowing for that.
I don't know much about Node and have no idea what multer-gridfs is, but I can answer "How can I pass options into an imported module?"
You can export a function that returns another function. You would import it like:
const configFunction = require('nameoffile')
// this returns a functions with the config you want
const doSomethingDependingOnTheConfig = configFunction({...someConfig})
And in the file you are importing from, you would have a function returning another function, like:
const configFunction = ({...someConfig}) => (your, func) => {
// do what you want deppending on the config
}
module.exports = configFunction
I know this doesn't answer your question the way you want, but it answers your question title, and I hope it gives you a better understanding of how to do what you want to do.
If this doesn't help, just let me know.
You would need to pass a parameter to the module gridFs_upload_engine.js and do the magic there.
An example could be:
In gridFs_upload_engine.js file:
function uploadEngine (id, file) {
// update what you want
}
module.exports = {
...
uploadEngine: uploadEngine
}
In your router:
const upload = require('../../utils/gridFs_upload_engine')
...
router.post('/single/:id', function(req, res, next) {
...
upload.uploadEngine(req.params.id, file)
next()
}, upload_controller.upload_single_file)
In other words, when you are exposing gfs and uploadEngine inside your module, you could instead expose a function that would receive the arguments needed to perform the upload.
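A related sketch (an assumption, not from either answer): multer-gridfs-storage passes the incoming request to the file callback as its first argument, so the metadata can often be derived from the request itself instead of being passed into the module. Here uploadedBy and the /single/:id route shape are hypothetical:
// gridFs_upload_engine.js (sketch, reusing the imports from the original file)
const storage = new GridFsStorage({
  url: mongoDb,
  file: (req, file) => {
    return new Promise((resolve, reject) => {
      crypto.randomBytes(16, (err, buf) => {
        if (err) {
          return reject(err);
        }
        resolve({
          filename: buf.toString('hex') + path.extname(file.originalname),
          bucketName: 'uploads',
          // req is the Express request, so route params or auth data are available here
          metadata: { uploadedBy: req.params.id }
        });
      });
    });
  }
});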

How to use Browser history in react router with koa

In Express, we can just use the following code to deal with the request. The server will send index.html for any request that isn't handled by the router.
app.get('*', function (request, response){
response.sendFile(path.resolve(__dirname, '../public', 'index.html'))
})
But in Koa, the following code doesn't work. When the request isn't handled by koa-router, it returns a 404 instead of index.html.
var send = require('koa-send')
var serve = require('koa-static')
var router = require('koa-router')
var path = require('path')
var koa = require('koa')
var app = koa();
app.use(serve(__dirname + '/../public'));
app.use(function *() {
  yield send(this, path.join(__dirname, '/../public/', 'index.html'));
})
app.use(router.routes())
The following code also doesn't work:
router
.get('*', function* () {
yield send(this, __dirname +'/../public/index.html')
})
const fs = require('fs')
const path = require('path')

router.get('*', async function(ctx, next) {
  var html = fs.readFileSync(path.resolve('./build/index.html'));
  ctx.type = 'html';
  ctx.body = html;
})
this works for me
Essentially what you're trying to achieve is server-rendering.
You need to write route configuration with match & RouterContext. react-router has detailed documentation for this.
Server Rendering in react-router
In the case of Koa, it can roughly be done in this way.
import router from 'koa-router'
import { match, RouterContext } from 'react-router'
const koaRouter = router()
const otherRouter = () => {
return new Promise((resolve, reject) => {
match({ routes, location }, (error, redirectLocation, renderProps) => {
      ...
    })
  })
}
koaRouter
.use(otherRouter)
I found a couple of repos online which seem pretty decent. I haven't verified them, though.
breko-hub
koa-react-isomorphic
