I'm brand new to Aurelia.
How would you change the following code to provide a dummy HttpClient instead, e.g. a JSON reader that returns a static set of JSON data, removing the need for a server during development?
import {inject} from 'aurelia-framework';
import {HttpClient} from 'aurelia-fetch-client';
@inject(HttpClient)
export class Users {
heading = 'Github Users';
users = [];
constructor(http) {
http.configure(config => {
config
.useStandardConfiguration()
.withBaseUrl('https://api.github.com/');
});
this.http = http;
}
activate() {
return this.http.fetch('users')
.then(response => response.json())
.then(users => this.users = users);
}
}
There are a couple of steps required to get the demo code in your original post to a state where we can substitute HttpClient implementations.
Step 1
Remove the configuration code from the class's constructor.
These lines:
users.js
...
http.configure(config => {
config
.useStandardConfiguration()
.withBaseUrl('https://api.github.com/');
});
...
Should move to the main.js file:
main.js
import {HttpClient} from 'aurelia-fetch-client';
export function configure(aurelia) {
aurelia.use
.standardConfiguration()
.developmentLogging();
configureContainer(aurelia.container); // <--------
aurelia.start().then(a => a.setRoot());
}
function configureContainer(container) {
let http = new HttpClient();
http.configure(config => {
config
.useStandardConfiguration()
.withBaseUrl('https://api.github.com/');
});
container.registerInstance(HttpClient, http); // <---- this line ensures everyone who `@inject`s a `HttpClient` instance will get the instance we configured above.
}
Now our users.js file should look like this:
users.js
import {inject} from 'aurelia-framework';
import {HttpClient} from 'aurelia-fetch-client';
@inject(HttpClient)
export class Users {
heading = 'Github Users';
users = [];
constructor(http) {
this.http = http;
}
activate() {
return this.http.fetch('users')
.then(response => response.json())
.then(users => this.users = users);
}
}
Step 2:
Mock the HttpClient.
The users.js module only uses the fetch method, which returns a Response object with a json method. Here's a simple mock:
let mockUsers = [...todo: create mock user data...];
let httpMock = {
fetch: url => Promise.resolve({
json: () => mockUsers
})
};
Step 3:
Reconfigure the container to use the http mock:
In step 1 we added a configureContainer function to the main.js module that registered a configured HttpClient instance in the container. If we wanted to use our mock version, the configureContainer function would change to this:
main.js
...
let mockUsers = [...todo: create mock user data...];
let httpMock = {
fetch: url => Promise.resolve({
json: () => mockUsers
})
};
function configureContainer(container) {
container.registerInstance(HttpClient, httpMock);
}
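If you want to switch between the mock and the real client without editing code each time, one option (a sketch I'm adding here, not part of the original answer) is to key the registration off a simple flag:
main.js
const USE_MOCK = true; // hypothetical flag: flip to false to hit the real GitHub API
function configureContainer(container) {
  if (USE_MOCK) {
    container.registerInstance(HttpClient, httpMock);
    return;
  }
  let http = new HttpClient();
  http.configure(config => {
    config
      .useStandardConfiguration()
      .withBaseUrl('https://api.github.com/');
  });
  container.registerInstance(HttpClient, http);
}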
More info on configuring the container here: https://github.com/aurelia/dependency-injection/issues/73
There is another way to provide static data for the application during development. The Aurelia navigation skeleton already comes with Gulp and BrowserSync, so we used those to fake API calls.
Let's say you load JSON data from an /api virtual directory, e.g.
GET /api/products
In this case you just need two things to fake it.
Put your mock data into files
Go to the root folder of your Aurelia app and create an /api folder.
Create an /api/products subfolder and put a new file called GET.json in it. This file should contain the JSON response, e.g.
GET.json
[ { "id": 1, "name": "Keyboard", "price": "60$" },
{ "id": 2, "name": "Mouse", "price": "20$" },
{ "id": 3, "name": "Headphones", "price": "80$" }
]
Configure BrowserSync to mock your API calls
Navigate to the /build/tasks folder and edit the serve.js file. Change the definition of the serve task to the following code:
gulp.task('serve', ['build'], function(done) {
browserSync({
online: false,
open: false,
port: 9000,
server: {
baseDir: ['.'],
middleware: function(req, res, next) {
res.setHeader('Access-Control-Allow-Origin', '*');
// Mock API calls
if (req.url.indexOf('/api/') > -1) {
console.log('[serve] responding ' + req.method + ' ' + req.originalUrl);
var jsonResponseUri = req._parsedUrl.pathname + '/' + req.method + '.json';
// Require the file to make sure it exists; if it's not found, require will
// throw an exception and the middleware will cancel the retrieve action
var jsonResponse = require('../..' + jsonResponseUri);
// Replace the original call with retrieving json file as reply
req.url = jsonResponseUri;
req.method = 'GET';
}
next();
}
}
}, done);
});
Now, when you run gulp serve, BrowserSync will handle your API calls and serve them from the static files on disk.
You can see an example in my GitHub repo and more details in my Mocking API calls in Aurelia post.
Related
I'm familiar with Svelte but completely new to SvelteKit. I'm trying to build a SvelteKit app from scratch using AWS Cognito as the authorization tool without using AWS Amplify, using the amazon-cognito-identity-js SDK. I've got all the functionality working as far as login, registration, and verification, but I can't seem to get a handle on how to store the session data for the structure I've built.
I've been trying to translate the strategies from this tutorial, based in React, to Sveltekit -- (AWS Cognito + React JS Tutorial - Sessions and Logging out (2020) [Ep. 3]) https://www.youtube.com/watch?v=R-3uXlTudSQ
and this REPL to understand using context in Svelte ([AD] Combining the Context API with Stores) https://svelte.dev/repl/7df82f6174b8408285a1ea0735cf2ff0
To elaborate, I've got my structure like so (only important parts shown):
src
|
|-- components
|-- ...
|-- status.svelte
|-- routes
|
|-- dashboard
|-- onboarding
|-- __layout.reset.svelte
|-- login.svelte
|-- signup.svelte
|-- verify.svelte
|-- ...
|-- settings
|-- __layout.svelte
|-- index.svelte
|-- styles
|-- utils
|-- cognitoTools.ts
|-- stores.ts
I wanted to have a separate path for my onboarding pages, hence the sub-folder. My Cognito-based functions reside within cognitoTools.ts. A few example functions look like this:
export const Pool = new CognitoUserPool(poolData);
export const User = (Username: string): any => new CognitoUser({ Username, Pool });
export const Login = (username: string, password: string): any => {
return new Promise ((resolve, reject) => User(username).authenticateUser(CognitoAuthDetails(username, password), {
onSuccess: function(result) {
console.log('CogTools login success result: ', result);
resolve(result)
},
onFailure: function(err) {
console.error('CogTools login err: ', err);
reject(err)
}
}))
}
I'm able to then use the methods freely anywhere:
// src/routes/onboarding/login.svelte
import { Login, Pool } from '@utils/cognitoTools'
import { setContext, getContext } from 'svelte'
let username;
let password;
let session = writeable({});
let currentSession;
// Setting our userSession store to variable that will be updated
$: userSession.set(currentSession);
// Attempt to retrieve getSession func defined from wrapper component __layout.svelte
const getSession = getContext('getSession');
const handleSubmit = async (event) => {
event.preventDefault()
Login(username, password, rememberDevice)
.then(() => {
getSession().then((session) => {
// userSession.set(session);
currentSession = session;
setContext('currentSession', userSession);
})
})
}
...
// src/routes/__layout.svelte
...
const getSession = async () => {
return await new Promise((resolve, reject) => {
const user = Pool.getCurrentUser();
if (user) {
user.getSession((err, session) => {
console.log('User get session result: ', (err ? err : session));
err ? reject() : resolve(session);
});
} else {
console.log('get session no user found');
reject();
}
})
}
setContext('getSession', getSession)
Then, I've been trying to retrieve the session in src/components/status.svelte or src/routes/__layout.svelte (as I understand it, context has to be set in top-level components and can then be used by indirect child components) to check if the context was set correctly.
Something like:
let status = false;
const user = getContext('currentSession');
status = user ? true : false;
I'm running in circles and I know I'm so close to the answer. How do I use reactive context with my current file structure to accomplish this?
I don't know much about the SDK, so I can't help you with your code above. But I also built an app that uses Cognito for auth, and I can share some snippets on how to do it from scratch.
Implement a login form. I have my basic app skeleton (navbar, footer, main slot) in __layout.svelte, and it is configured to show the Login.svelte component instead of the main slot if the user is not logged in.
file: __layout.svelte
<script context="module">
export async function load({ session }) {
return {
props: {
user: session.user,
}
}
}
</script>
<script>
import "../app.css";
import Login from "$components/Login.svelte";
export let user
</script>
<svelte:head>
<title>title</title>
</svelte:head>
{#if user}
<header>
</header>
<main>
<slot />
</main>
{:else}
<Login />
{/if}
file: Login.svelte
<form action="/" method="GET">
<input type="hidden" name="action" value="signin" />
<button type="submit" >Sign in</button>
</form>
Handle the login
I chose to do this as a SvelteKit endpoint paired with the index. It keeps the routing super simple. You could do separate login.js and logout.js endpoints if you prefer. Just change the URL in the form above.
file: index.js
import { v4 as uuid } from '@lukeed/uuid'
import db from '$lib/db'
const domain = import.meta.env.VITE_COGNITO_DOMAIN
const clientId = import.meta.env.VITE_COGNITO_CLIENT_ID
const redirectUri = import.meta.env.VITE_COGNITO_REDIRECT_URI
const logoutUri = import.meta.env.VITE_COGNITO_LOGOUT_URI
export const get = async (event) => {
const action = event.url.searchParams.get('action')
if (action === 'signin') {
// Hard to guess random string. Used to protect against forgery attacks.
// Should add check in callback that the state matches to prevent forgery
const state = uuid()
return {
status: 302,
headers: {
location: `https://${domain}/oauth2/authorize?response_type=code&client_id=${clientId}&redirect_uri=${redirectUri}&scope=openid+email+profile&state=${state}`,
},
}
}
if (action === 'signout') {
// delete this session from database
if (event.locals.session_id) {
await db.sessions.deleteMany({
where: { session_id: event.locals.session_id }
})
}
return {
status: 302,
headers: {
location: `https://${domain}/logout?client_id=${clientId}&logout_uri=${logoutUri}`
}
}
}
return {}
}
Handle the callback from AWS Cognito. Again, I have the callback simply point to the root URL; all authentication for me is handled at "/". The heavy lifting is done by hooks.js, which is your SvelteKit middleware. For me it is the sole arbiter of the user's authentication state, just because I like to keep it easy to understand.
file: hooks.js
import { v4 as uuid } from '@lukeed/uuid'
import cookie from 'cookie'
import db from '$lib/db'
const domain = import.meta.env.VITE_COGNITO_DOMAIN
const clientId = import.meta.env.VITE_COGNITO_CLIENT_ID
const clientSecret = import.meta.env.VITE_COGNITO_CLIENT_SECRET
const redirectUri = import.meta.env.VITE_COGNITO_REDIRECT_URI
const tokenUrl = `https://${domain}/oauth2/token`
const profileUrl = `https://${domain}/oauth2/userInfo`
export const handle = async ({ event, resolve }) => {
const cookies = cookie.parse(event.request.headers.get('cookie') || '')
event.locals.session_id = cookies.session_id // this will be overwritten by a new session_id if this is a callback
if (event.locals.session_id) {
// We have a session cookie, check to see if it is valid.
// Do this by checking against your session db or session store or whatever
// If not valid, or if it is expired, set event.locals.session_id to null
// This will cause the cookie to be deleted below
// In this example, we just assume it's valid
}
if ( (!event.locals.session_id) && event.url.searchParams.get('code') && event.url.searchParams.get('state') ) {
// No valid session cookie, check to see if this is a callback
const code = event.url.searchParams.get('code')
const state = event.url.searchParams.get('state')
// Change this to try, catch for error handling
const token = await getToken(code, state)
if (token != null) {
let cognitoUser = await getUser(token)
event.locals.session_id = uuid()
// Add the value to the db
await db.sessions.create({
data: {
session_id: event.locals.session_id,
user_id: cognitoUser.username,
created: Date()
},
})
let user = await db.users.findUnique({
where: {
user_id: cognitoUser.username,
}
})
event.locals.user = user
event.locals.authorized = true
}
}
const response = await resolve(event);
// This will delete the cookie if event.locals.session_id is null
response.headers.set(
'set-cookie',
cookie.serialize('session_id', event.locals.session_id, {
path: '/',
httpOnly: true,
sameSite: 'strict',
maxAge: 60 * 60 * 24 * 7, // one week
})
)
return response;
}
export async function getSession(event) {
return {
user: event.locals.user,
}
}
const getToken = async (code, state) => {
let authorization = Buffer.from(`${clientId}:${clientSecret}`).toString('base64')
const res = await fetch(tokenUrl, {
method: 'POST',
headers: {
'Content-Type': 'application/x-www-form-urlencoded',
Authorization: `Basic ${authorization}`,
},
body: `grant_type=authorization_code&client_id=${clientId}&code=${code}&state=${state}&redirect_uri=${redirectUri}`
})
if (res.ok) {
const data = await res.json()
return data.access_token
} else {
return null
}
}
const getUser = async (token) => {
const res = await fetch(profileUrl, {
headers: {
Authorization: `Bearer ${token}`,
},
})
if (res.ok) {
return res.json()
} else {
return null
}
}
Lastly, getting the auth state. In a client-side route, this is done via the page load function. I put this in __layout to have it available on all .svelte routes. You can see it at the top of the __layout file above. For SSR endpoints, you can just access event.locals directly.
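For example, a minimal sketch of an endpoint reading that state directly (the route name is hypothetical; it relies on the locals set in hooks.js above):
file: whoami.js
export const get = async (event) => {
  // event.locals is populated by the handle hook above
  if (!event.locals.authorized) {
    return { status: 401, body: { error: 'not signed in' } }
  }
  return { status: 200, body: { user: event.locals.user } }
}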
NOTE: All of the env vars are set in your .env file and are imported by Vite. This only happens when you start your app, so if you add or change them, you need to restart it.
I don't know if this helps at all since it is so different from your app structure, but maybe you will get some ideas from it.
I don't know exactly what problem you are running into, but one thing that stands out to me is that you are calling setContext too late. You can only call getContext/setContext during component initialization. See this answer for more details: Is there a way to use svelte getContext etc. svelte functions in Typescript files?
If this is the culprit and you are looking for a way to get the session, use context in combination with stores:
<!-- setting the session -->
<script>
// ...
const session = writable(null);
setContext('currentSession', session);
// ...
Login...then(() => ...session.set(session));
</script>
<!-- getting the session -->
<script>
// ..
const user = getContext('currentSession');
// ..
status = $user ? true : false;
</script>
Another thing that stands out to me (but is too long/vague for a Stack Overflow answer) is that you are not using SvelteKit's features to achieve this behavior. You could look into load and use __layout to pass the session down to all children. I'm not sure whether this is of any advantage for you, though, since you may be planning to build a SPA anyway and therefore don't need such SvelteKit features.
So I have written code to handle this: essentially, it looks for any placeholder.controller.ts file in my project and uses the exported array of controllers to add routing. Controllers have to be default-exported in an array with a specific format.
The format looks like this:
const controller1 = {
endpoint: '/hello/world',
method: 'get',
controller: () => console.log('hello world!'),
}
export default [
controller1
];
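For reference, the routing code below also reads a few optional fields from each route (isSecure, middlewareBefore, middlewareAfter). A hedged sketch of a controller entry using them (logRequest is a made-up middleware, defined inline for illustration):
const logRequest = (req, res, next) => { console.log(req.method, req.url); next(); };
const controller2 = {
  endpoint: '/hello/:name',
  method: 'post',
  isSecure: true, // the route gets wrapped with secureRoute
  middlewareBefore: [logRequest], // runs before the secure check and the controller
  controller: (req, res) => res.json({ greeting: `hello ${req.params.name}` }),
};
export default [controller2];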
The routing code that handles all of this lives in a routes.ts file and looks like this:
import glob from 'glob';
import path from 'path';
import { Router } from 'express';
import { toArray } from '../lib/utilities/generic-utilities';
import { isRouteType, isArrayWithContent } from '../lib/utilities/type-checking';
import { skip } from '../lib/middleware/generic-middleware';
import { Route } from '../meta/@types/common-types';
import { secureRoutesConstant, extension } from './settings';
import secureRoute from '../lib/middleware/secure-route';
const router: any = Router({ mergeParams: true });
// relative path from routes file to controllers folder.
const controllersPath = '../http/controllers/';
const addRouteToRouter = (route: Route, filename: string) => {
const acceptableRoute: object | boolean = isRouteType(route);
const message: string = `issue with route while exporting a controller in file ${filename}\nroute supplied was:`;
if (!acceptableRoute) console.log(message, route);
if (!acceptableRoute) return;
const { endpoint, controller, method, isSecure = secureRoutesConstant } = route;
const { middlewareBefore = [], middlewareAfter = [] } = route;
const makeRouteSecure: Function = isSecure ? secureRoute : skip;
const middlewareBeforeArr: Function[] = toArray(middlewareBefore);
const middlewareAfterArr: Function[] = toArray(middlewareAfter);
const routeArguments: Function[] = [
...middlewareBeforeArr,
makeRouteSecure,
controller,
...middlewareAfterArr,
];
router.route(endpoint)[method](...routeArguments);
};
const addToRouterForEach = (allRoutes: Route[], filename: string) =>
allRoutes.forEach((route: Route) => addRouteToRouter(route, filename));
glob
.sync('**/*.ts', { cwd: path.join(`${__dirname}/`, controllersPath) })
.filter((filename: string) => filename.split('.').includes('controller'))
.map((filename: string) => ({ defaultsObj: require(`${controllersPath}${filename}`), filename }))
.filter(({ defaultsObj }) => isArrayWithContent(defaultsObj.default))
.forEach(({ defaultsObj, filename }) => addToRouterForEach(defaultsObj.default, filename));
export default router;
It is then simply imported into app.ts and used like this:
app.use('/api', router)
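For completeness, a minimal app.ts sketch of that wiring (the import paths are assumptions based on the layout described above):
import express from 'express';
import router from './routes/routes';
const app = express();
app.use(express.json());
app.use('/api', router); // every discovered controller is mounted under /api
app.listen(3000, () => console.log('API listening on port 3000'));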
Essentially this means I have no routing code, as it's all handled for me; I only have to write my services, controllers and models.
Are there any performance or security issues with doing things like this, or with the code itself?
Are there any performance or security issues with doing things like this, or with the code itself?
Performance
No. The auto-wiring code only runs at boot, and even if it takes a second, it's not a cost you pay when handling individual client requests.
Security
The code architecture is secure by itself and does not increase your risk of vulnerabilities: the controller files being required are discovered from your own source tree at boot, not derived from user input.
I need a graphql client lib to run on node.js for some testing and some data mashup - not in a production capacity. I'm using apollo everywhere else (react-apollo, apollo's graphql-server-express). My needs are pretty simple.
Is apollo-client a viable choice? I can find no examples or docs on using it on node - if you're aware of any, please share.
Or maybe I should/can use the reference graphql client on node?
Apollo Client should work just fine on Node. You only have to install cross-fetch.
Here is a complete TypeScript implementation of Apollo Client working on Node.js.
import { ApolloClient, gql, HttpLink, InMemoryCache } from "@apollo/client";
import { InsertJob } from "./graphql-types";
import fetch from "cross-fetch";
const client = new ApolloClient({
link: new HttpLink({ uri: process.env.PRODUCTION_GRAPHQL_URL, fetch }),
cache: new InMemoryCache(),
});
client.mutate<InsertJob.AddCompany, InsertJob.Variables>({
mutation: gql`mutation insertJob($companyName: String!) {
addCompany(input: { displayName: $companyName } ) {
id
}
}`,
variables: {
companyName: "aaa"
}
})
.then(result => console.log(result));
Newer Apollo versions provide a simpler approach, as described in the Apollo docs (check the "Standalone" section). Basically, one can simply use Apollo Link to perform a query or mutation.
Below is a copy of the example code from the docs as of this writing, with node-fetch passed to createHttpLink. Check the docs for more details on how to use these tools.
import { execute, makePromise } from 'apollo-link';
import { createHttpLink } from 'apollo-link-http';
import gql from 'graphql-tag';
import fetch from 'node-fetch';
const uri = 'http://localhost:4000/graphql';
const link = createHttpLink({ uri, fetch });
const operation = {
query: gql`query { hello }`,
variables: {}, // optional
operationName: {}, // optional
context: {}, // optional
extensions: {} // optional
};
// execute returns an Observable so it can be subscribed to
execute(link, operation).subscribe({
next: data => console.log(`received data: ${JSON.stringify(data, null, 2)}`),
error: error => console.log(`received error ${error}`),
complete: () => console.log(`complete`),
})
// For single execution operations, a Promise can be used
makePromise(execute(link, operation))
.then(data => console.log(`received data ${JSON.stringify(data, null, 2)}`))
.catch(error => console.log(`received error ${error}`))
If someone is looking for a JavaScript version:
require('dotenv').config();
const gql = require('graphql-tag');
const ApolloClient = require('apollo-boost').ApolloClient;
const fetch = require('cross-fetch/polyfill').fetch;
const createHttpLink = require('apollo-link-http').createHttpLink;
const InMemoryCache = require('apollo-cache-inmemory').InMemoryCache;
const client = new ApolloClient({
link: createHttpLink({
uri: process.env.API,
fetch: fetch
}),
cache: new InMemoryCache()
});
client.mutate({
mutation: gql`
mutation popJob {
popJob {
id
type
param
status
progress
creation_date
expiration_date
}
}
`,
}).then(job => {
console.log(job);
})
You can make apollo-client work, but it's not the best option for this use case.
Try graphql-request instead.
Minimal GraphQL client supporting Node and browsers for scripts or simple apps
Features per npmjs:
Most simple & lightweight GraphQL client
Promise-based API (works with async / await)
Typescript support
Isomorphic (works with Node / browsers)
example:
import { request, gql } from 'graphql-request'
const query = gql`
{
Movie(title: "Inception") {
releaseDate
actors {
name
}
}
}
`
request('https://api.graph.cool/simple/v1/movies', query).then((data) => console.log(data))
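Since the API is promise-based, the same call also works with async/await (reusing the query from above):
const main = async () => {
  const data = await request('https://api.graph.cool/simple/v1/movies', query)
  console.log(data)
}
main()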
I have no affiliation with this package.
Here is a simple Node.js implementation.
The 'graphiql' client is good enough for development activities.
1. Run npm install
2. Start the server with "node server.js"
3. Hit "http://localhost:8080/graphiql" for the GraphiQL client
server.js
var graphql = require ('graphql').graphql
var express = require('express')
var graphQLHTTP = require('express-graphql')
var Schema = require('./schema')
// This is just an internal test
var query = 'query{starwar{name, gender,gender}}'
graphql(Schema, query).then( function(result) {
console.log(JSON.stringify(result,null," "));
});
var app = express()
.use('/', graphQLHTTP({ schema: Schema, pretty: true, graphiql: true }))
.listen(8080, function (err) {
console.log('GraphQL Server is now running on localhost:8080');
});
schema.js
//schema.js
var graphql = require ('graphql');
var http = require('http');
var StarWar = [
{
"name": "default",
"gender": "default",
"mass": "default"
}
];
var TodoType = new graphql.GraphQLObjectType({
name: 'starwar',
fields: function () {
return {
name: {
type: graphql.GraphQLString
},
gender: {
type: graphql.GraphQLString
},
mass: {
type: graphql.GraphQLString
}
}
}
});
var QueryType = new graphql.GraphQLObjectType({
name: 'Query',
fields: function () {
return {
starwar: {
type: new graphql.GraphQLList(TodoType),
resolve: function () {
return new Promise(function (resolve, reject) {
var request = http.get({
hostname: 'swapi.co',
path: '/api/people/1/',
method: 'GET'
}, function(res){
res.setEncoding('utf8');
res.on('data', function(response){
StarWar = [JSON.parse(response)];
resolve(StarWar)
console.log('On response success:' , StarWar);
});
});
request.on('error', function(response){
console.log('On error' , response.message);
});
request.end();
});
}
}
}
}
});
module.exports = new graphql.GraphQLSchema({
query: QueryType
});
In response to @YakirNa's comment:
I can't speak to the other needs I described, but I have done a fair amount of testing. I ended up doing all of my testing in-process.
Most testing ends up being resolver testing, which I do via a jig that invokes the graphql library's graphql function with a test query and then validates the response.
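A minimal sketch of such a jig (assuming a Jest-style runner; the schema and query here are illustrative, not the author's):
// resolver test jig (sketch): build a schema, run a query with graphql(), assert on the result
const { graphql, GraphQLSchema, GraphQLObjectType, GraphQLString } = require('graphql');
const schema = new GraphQLSchema({
  query: new GraphQLObjectType({
    name: 'Query',
    fields: {
      hello: { type: GraphQLString, resolve: () => 'world' },
    },
  }),
});
test('hello resolver', async () => {
  const result = await graphql(schema, '{ hello }');
  expect(result.errors).toBeUndefined();
  expect(result.data.hello).toBe('world');
});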
I also have an (almost) end-to-end test layer that works at the http-handling level of express. It creates a fake HTTP request and verifies the response in-process. This is all within the server process; nothing goes over the wire. I use this lightly, mostly for testing JWT authentication and other request-level behavior that's independent of the graphql request body.
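One common way to build HTTP-level tests like this against an express app (not necessarily the exact jig described above) is supertest:
// exercise the GraphQL endpoint's request handling without a public port
const request = require('supertest');
const express = require('express');
const graphQLHTTP = require('express-graphql');
const schema = require('./schema'); // any GraphQLSchema, e.g. the schema.js from the earlier answer
const app = express().use('/', graphQLHTTP({ schema, graphiql: false }));
test('rejects a request without a query', async () => {
  const res = await request(app).post('/').send({});
  expect(res.status).toBe(400);
});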
I was running into the same question, because I wanted to create a middleware service to prepare data from GraphQL for a final frontend application,
in order to have:
optimised data representation (and a standard output data interface)
faster response time
assuming that the GraphQL server is provided by an external provider, so there is no ownership of the data model directly with GQL.
So I didn't want to implement the GraphQL Apollo Client directly in a frontend framework like React, Angular, or Vue.js, but instead manage the queries via Node.js at the backend of a REST API.
So this is the class wrapper for Apollo Client I was able to assemble (using TypeScript):
import ApolloClient from "apollo-client";
import { ApolloLink } from 'apollo-link'
import { HttpLink } from 'apollo-link-http'
import { onError } from 'apollo-link-error'
import fetch from 'node-fetch'
import { InMemoryCache, IntrospectionFragmentMatcher } from 'apollo-cache-inmemory'
import introspectionQueryResultData from '../../fragmentTypes.json';
import { AppConfig } from 'app-config';
const config: AppConfig = require('../../../appConfig.js');
export class GraphQLQueryClient {
protected apolloClient: any;
constructor(headers: { [name: string]: string }) {
const api: any = {
spaceId: config.app.spaceId,
environmentId: config.app.environmentId,
uri: config.app.uri,
cdnApiPreviewToken: config.cdnApiPreviewToken,
};
// console.log(JSON.stringify(api));
const ACCESS_TOKEN = api.cdnApiPreviewToken;
const uri = api.uri;
console.log(`Apollo client setup to query uri: ${uri}`);
const fragmentMatcher = new IntrospectionFragmentMatcher({
introspectionQueryResultData
});
this.apolloClient = new ApolloClient({
link: ApolloLink.from([
onError(({ graphQLErrors, networkError }:any) => {
if (graphQLErrors) {
graphQLErrors.map((el:any) =>
console.warn(
el.message || el
)
)
graphQLErrors.map(({ message, locations, path }:any) =>
console.warn(
`[GraphQL error - Env ${api.environmentId}]: Message: ${message}, Location: ${JSON.stringify(locations)}, Path: ${path}`
)
)
}
if (networkError) console.log(`[Network error]: ${networkError}`)
}),
new HttpLink({
uri,
credentials: 'same-origin',
headers: {
Authorization: `Bearer ${ACCESS_TOKEN}`
},
fetch
})
]),
cache: new InMemoryCache({ fragmentMatcher }),
// fetchPolicy as network-only avoids using the cache.
defaultOptions: {
watchQuery: {
fetchPolicy: 'network-only',
errorPolicy: 'ignore',
},
query: {
fetchPolicy: 'network-only',
errorPolicy: 'all',
},
}
});
}
}
After this constructor I run queries like:
let response = await this.apolloClient.query({ query: gql`${query}` });
As you might have noticed:
I needed to inject fetch into HttpLink
I had to set up Authorization headers to access the external provider's GraphQL endpoint
I used IntrospectionFragmentMatcher in order to use fragments in my queries, along with building the schema types ("fragmentTypes.json" with an init script)
Posting this just to add my experience and maybe more info for the question.
Also looking forward to comments and points of improvement for this wrapper.
Issue
I'm looking for a way to bring up a NestJS application with mocked providers. This is necessary for provider contract tests because a service needs to be brought up in isolation. Using the Pact library, testing the provider assumes that the provider service is already running. It needs to be able to make HTTP requests against the actual server (with some dependencies mocked if necessary). PactJS
Current Research
I've looked into the docs for NestJS, and the closest solution I can find is pasted below. From what I can tell, this solution tells the module to replace any provider called CatsService with catsService. This would theoretically work for provider contract testing, but I don't think it allows the entire app to be brought up, just a module. There is no mention in the docs of being able to bring up the app on a specific port using the testing module. I've tried calling app.listen on the returned app object, and it fails to hit a breakpoint placed right after the call.
import * as request from "supertest";
import { Test } from "@nestjs/testing";
import { CatsModule } from "../../src/cats/cats.module";
import { CatsService } from "../../src/cats/cats.service";
import { INestApplication } from "@nestjs/common";
describe("Cats", () => {
let app: INestApplication;
let catsService = { findAll: () => ["test"] };
beforeAll(async () => {
const module = await Test.createTestingModule({
imports: [CatsModule]
})
.overrideProvider(CatsService)
.useValue(catsService)
.compile();
app = module.createNestApplication();
await app.init();
});
it(`/GET cats`, () => {
return request(app.getHttpServer())
.get("/cats")
.expect(200)
.expect({
data: catsService.findAll()
});
});
afterAll(async () => {
await app.close();
});
});
Java Example
Using a Spring configuration class, mocks can be injected into the app when running with the "contract-test" profile.
@Profile({"contract-test"})
@Configuration
public class ContractTestConfig {
@Bean
@Primary
public SomeRepository getSomeRepository() {
return mock(SomeRepository.class);
}
@Bean
@Primary
public SomeService getSomeService() {
return mock(SomeService.class);
}
}
Update
Since version 4.4 you can also use listen, since it now also returns a Promise.
You have to use the method listenAsync instead of listen so that you can use it with await:
beforeAll(async () => {
const moduleFixture = await Test.createTestingModule({
imports: [AppModule],
})
.overrideProvider(AppService).useValue({ root: () => 'Hello Test!' })
.compile();
app = moduleFixture.createNestApplication();
await app.init();
await app.listenAsync(3000);
^^^^^^^^^^^^^^^^^^^^^
});
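With Nest >= 4.4 (per the update above), the same setup works with listen directly:
await app.listen(3000); // inside beforeAll, replacing the listenAsync call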
Then you can make actual HTTP requests instead of relying on supertest. (I am using the Node.js standard http library in this example.)
import * as http from 'http';
// ...
it('/GET /', done => {
http.get('http://localhost:3000/root', res => {
let data = '';
res.on('data', chunk => data = data + chunk);
res.on('end', () => {
expect(data).toEqual('Hello Test!');
expect(res.statusCode).toBe(200);
done();
});
});
});
Don't forget to close the application, otherwise your test run will hang until it is closed manually.
afterAll(() => app.close());
What is the easiest way to mock the response returned by Http get() in Angular 2?
I have a local data.json file in my working directory, and I want get() to return a response containing that data as a payload, simulating the REST API.
The documentation for configuring the Backend object for Http seemed somewhat obscure and overcomplicated for such a simple task.
You need to override the XHRBackend provider with the MockBackend one. You then need to create another injector to be able to execute a true HTTP request.
Here is a sample:
beforeEachProviders(() => {
return [
HTTP_PROVIDERS,
provide(XHRBackend, { useClass: MockBackend }),
SomeHttpService
];
});
it('Should something', inject([XHRBackend, SomeHttpService], (mockBackend, httpService) => {
mockBackend.connections.subscribe(
(connection: MockConnection) => {
var injector = ReflectiveInjector.resolveAndCreate([
HTTP_PROVIDERS
]);
var http = injector.get(Http);
http.get('data.json').map(res => res.json()).subscribe(data => {
connection.mockRespond(new Response(
new ResponseOptions({
body: data
})));
});
});
}));
By the way, you need to mock the XHRBackend and provide mocked data in a class with a createDb method. The createDb method returns the mocked JSON object. To load that data, provide the correct URL to http.get: for example, if the JSON object is contained in a variable mockedObject, then the URL should be "app/mockedObject".
You can read more details here: https://angular.io/docs/ts/latest/guide/server-communication.html.
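A hedged sketch of that createDb approach using angular-in-memory-web-api (the class name and data key are illustrative):
// mocked-data.ts (sketch)
import { InMemoryDbService } from 'angular-in-memory-web-api';
export class MockedData implements InMemoryDbService {
  createDb() {
    // the key becomes the URL segment, e.g. this.http.get('app/mockedObject')
    const mockedObject = [{ id: 1, name: 'test' }];
    return { mockedObject };
  }
}
Register it in your module imports with InMemoryWebApiModule.forRoot(MockedData) so it intercepts the Http requests.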
You can use the HttpTestingController available via the TestBed, as to me it feels more intuitive (each to their own, of course). Untested snippet:
import { TestBed, async } from '@angular/core/testing';
import { HttpClientTestingModule, HttpTestingController } from '@angular/common/http/testing';
import { MyApiService } from './my-api.service';
export function main() {
describe('Test set', () => {
let httpMock: HttpTestingController;
beforeEach(() => {
TestBed.configureTestingModule({
imports: [HttpClientTestingModule],
providers: [MyApiService]
});
httpMock = TestBed.get(HttpTestingController);
});
it('should call get', async(() => {
const data: any = {mydata: 'test'};
let actualResponse: any = null;
const service = TestBed.get(MyApiService);
service.get().subscribe((response: any) => {
actualResponse = response;
});
httpMock.expectOne('localhost:5555/my-path').flush(data);
expect(actualResponse).toEqual(data);
}));
});
}