I am trying to learn the way APIs work. Here I am trying to get the POST method to work. I am using this code to create the document in the database:
app.post('/add', async (req, res) => {
try {
const data = require('./test.json');
const newItemId = Math.floor(Math.random() * 1000 + 10).toString();
data.id = newItemId;
data.Partnership_Id = newItemId;
//for testing purpose only
let documentDefinition = {
"id": newItemId,
"name": "Angus MacGyver",
"state": "Building stuff"
};
// Open a reference to the database
const dbResponse = await cosmosClient.databases.createIfNotExists({
id: databaseId
});
let database = dbResponse.database;
const { container } = await database.containers.createIfNotExists({id: containerId});
// Add a new item to the container
console.log("** Create item **");
const createResponse = await container.items.create(data);
res.redirect('/');
} catch (error) {
console.log(error);
res.status(500).send("Error with database query: " + error.body);
}
})
Here I am using test.json for the data input. I am making a fake id using newItemId for data.id and data.Partnership_Id.
With this approach, I can create a document in the database and check it in Postman too, but there is nothing in the Body tab in Postman.
I am confused about this part; I feel like the data for the new document should be passed through the Postman Body rather than me generating it with newItemId.
This might be a silly question to ask, but I am trying to get my head around how APIs work and how to pass data to them.
IDs are almost always auto generated on the backend (or at least should be) when creating a database resource, so what you have seems to be correct. I would recommend using a library like nanoid to generate the ids though, just to remove the potential for errors.
It is RESTful convention to return the created data, so in this case you would return the created document as JSON, and then redirect etc. on the front end (to ensure complete separation of backend and frontend, so you can, say, host them separately). Your approach is also fine and works, though.
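For instance, a minimal sketch of what reading the document from the request body could look like (this assumes the express.json() body-parser middleware is registered and the nanoid package is installed; cosmosClient, databaseId and containerId are the same as in your code):
const { nanoid } = require('nanoid'); // CommonJS require works with nanoid v3
app.use(express.json()); // populate req.body from JSON request bodies
app.post('/add', async (req, res) => {
  try {
    // The new document's fields now come from the body you send in Postman
    const data = req.body;
    data.id = nanoid();            // still generated on the backend
    data.Partnership_Id = data.id;
    const { database } = await cosmosClient.databases.createIfNotExists({ id: databaseId });
    const { container } = await database.containers.createIfNotExists({ id: containerId });
    const { resource: createdItem } = await container.items.create(data);
    // RESTful convention: respond with the created document
    res.status(201).json(createdItem);
  } catch (error) {
    console.log(error);
    res.status(500).send("Error with database query: " + error.message);
  }
});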
My advice is to think of your backend and frontend as being completely separate; I would personally have a project for each. This way it is clearer how everything links together.
The request.files method is great. I can map through multipart files and push them to S3 with the moveToDisk method. However, it seems that when moveToDisk returns an error or anything else, it interrupts the loop. I'm pretty sure I have things at least close to set up correctly. I can successfully upload the first file and even store information in my database, but anything beyond one file doesn't work. Is anyone else having trouble with files or moveToDisk?
const sprites = request.files("sprites");
for (let sprite of sprites) {
const collection = request.all().collection;
const uuid = uuidv4();
await client
.db("pixel-shop")
.collection("sprites")
.insertOne({
collection: collection,
token: sprite.fileName,
path: `pixel-shop/${collection}/${uuid}`,
created_at: new Date(),
});
await sprite?.moveToDisk(
`pixel-shop/sprites/${request.all().collection}`,
{ name: uuid }
);
}
Some things I've noticed in the logs are a "cannot write file" error (which is weird because the first file writes fine), something about a .getUrl method, as well as some messages about SSL, but everything aside from my localhost is secure.
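One thing that might be worth trying (just a sketch, reusing the calls from the snippet above): wrap the per-file work in a try/catch so a single failing moveToDisk call doesn't stop the rest of the loop:
const sprites = request.files("sprites");
for (let sprite of sprites) {
  try {
    const collection = request.all().collection;
    const uuid = uuidv4();
    await client
      .db("pixel-shop")
      .collection("sprites")
      .insertOne({
        collection: collection,
        token: sprite.fileName,
        path: `pixel-shop/${collection}/${uuid}`,
        created_at: new Date(),
      });
    await sprite?.moveToDisk(`pixel-shop/sprites/${collection}`, { name: uuid });
  } catch (error) {
    // Log the failure and continue with the remaining files instead of aborting the loop
    console.error("Failed to process file", sprite.fileName, error);
  }
}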
I am currently working on adding Auth0 to a Vue.js/Node.js application and so far I have figured out how to allow users to register and log in (to /callback) and that seems to be working fine. However, I have manually added (will be automatic later on) some data to the user metadata section. I have the below code as a rule that is turned on. I can’t seem to get access to the data on the Vue.js end of things. What I’d like is to be able to get the user data and user metadata so I can store it in my front end.
Rule code
function (user, context, callback) {
const namespace = 'account_signup_type/';
const namespace2 = 'account_type';
context.idToken[namespace + 'is_new'] = (context.stats.loginsCount === 1);
context.idToken[namespace2] = user.account_type;
context.idToken.user = user;
callback(null, user, context);
}
Code I am trying in my Vue.js front end
getIdTokenClaims(o) {
return this.auth0Client.getIdTokenClaims(o);
}
Currently, this returns undefined
I ended up figuring it out: there was no namespace being created in the id_token, which resulted in the data not being properly passed through to the Vue.js app. I added a namespace using a web-address format with the domain extension cut off, and it now works.
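For example, a sketch of how the rule could look with URL-style namespaces (the exact namespace string is up to you, as long as it is a URL-like value you control; Auth0 silently drops non-namespaced custom claims):
function (user, context, callback) {
  const namespace = 'https://myapp/'; // any URL-style namespace you control
  context.idToken[namespace + 'is_new'] = (context.stats.loginsCount === 1);
  context.idToken[namespace + 'account_type'] = user.account_type;
  callback(null, user, context);
}
On the Vue.js side, getIdTokenClaims() then exposes the values as claims['https://myapp/account_type'] and so on.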
I'm trying to start writing tests for my application. I'm using Jest & Supertest to run all of my tests. When I try and run my test suite, I'm getting an error regarding a foreign key constraint.
The error:
error: truncate "users" restart identity - cannot truncate a table referenced in a foreign key constraint
error: cannot truncate a table referenced in a foreign key constraint
This is my server.spec.js file:
const request = require('supertest');
const server = require('./server.js');
const db = require('../data/db-config.js');
describe('server.js', () => {
describe('POST /register', () => {
it('should return 201 created', async () => {
const user =
{
name: "test",
username: "test",
email: "test77#test.com",
password: "password"
}
const res = await request(server).post('/api/auth/register').send(user);
expect(res.status).toBe(201);
})
beforeEach(async () => {
await db("graphs").truncate();
await db("users").truncate();
});
})
})
And here is my knex migration file:
exports.up = function(knex) {
return (
knex.schema
.createTable('users', tbl => {
tbl.increments();
tbl.string('username', 255).notNullable();
tbl.string('password', 255).notNullable();
tbl.string('name', 255).notNullable();
tbl.string('email', 255).unique().notNullable();
})
.createTable('graphs', tbl => {
tbl.increments();
tbl.string('graph_name', 255).notNullable();
tbl.specificType('dataset', 'integer[]').notNullable();
tbl
.integer('user_id')
.unsigned()
.notNullable()
.references('id')
.inTable('users')
.onDelete('CASCADE')
.onUpdate('CASCADE');
})
)
};
exports.down = function(knex) {
return (
knex.schema
.dropTableIfExists('graphs')
.dropTableIfExists('users')
)
};
I came across this answer in my research: How to test tables linked with foreign keys?
I'm new to both Postgres as well as testing. It makes sense that I would need to drop the tables in the reverse order like I have in my migration. But when I try to truncate them in the beforeEach section of my test, it doesn't seem to matter what order the tables are listed in.
I'm not sure where exactly to go from here. Any help would be greatly appreciated.
I think the trick here will be to resort to a bit of knex.raw:
await db.raw('TRUNCATE graphs, users RESTART IDENTITY CASCADE');
CASCADE because you don't want foreign key constraints getting in the way, and RESTART IDENTITY because the default Postgres behaviour is not to reset sequences. See TRUNCATE.
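So the beforeEach in the spec file could become (a sketch using the same db connection):
beforeEach(async () => {
  // One statement truncates both tables, resets their sequences, and follows the FK via CASCADE
  await db.raw('TRUNCATE graphs, users RESTART IDENTITY CASCADE');
});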
While we're on a related subject, allow me to introduce something that might make your life a lot easier: template databases! Templates are databases that Postgres can use to very rapidly recreate a database from a known state (by copying it). This can be faster than even truncating tables, and it allows us to skip all the annoying foreign key stuff when testing.
For example, you could use raw to do the following:
DROP DATABASE testdb;
CREATE DATABASE testdb TEMPLATE testdb_template;
This is a surprisingly inexpensive operation, and is great for testing because you can begin with a known state (not necessarily an empty one) each time you do a test run. I guess the caveats are that your knexfile.js will need to specify a connection with a user sufficiently credentialled to create and delete databases (so maybe an 'admin' connection that knows about localhost only) and that the template must be created and maintained. See Template Databases for more.
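A sketch of how that might look in a Jest global setup, assuming a separate 'admin' connection pointed at the postgres maintenance database (the connection details, database names, and file name here are made up):
// jest.globalSetup.js (hypothetical)
const knex = require('knex');
module.exports = async () => {
  // Connect to the maintenance DB, since you cannot drop the database you are connected to
  const admin = knex({
    client: 'pg',
    connection: { host: 'localhost', user: 'postgres', database: 'postgres' },
  });
  await admin.raw('DROP DATABASE IF EXISTS testdb');
  await admin.raw('CREATE DATABASE testdb TEMPLATE testdb_template');
  await admin.destroy();
};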
I have an app that uses Firebase, pretty much the whole stack: functions, database, storage, auth, messaging, the whole 9. I want to keep the client end very lightweight. So if a user comments on a post and "tags" another user, let's say using the typical "#username" style tagging, I moved all of the heavy lifting to the Firebase functions. That way the client doesn't have to figure out the user ID based on the username and do everything else. It is set up using triggers, so when the above scenario happens I write to a "table" called "create_notifications" with some data like
{
type: "comment",
post_id: postID,
from: user.getUid(),
comment_id: newCommentKey,
to: taggedUser
}
Where the taggedUser is the username, the postID is the active post, the newCommentKey is retrieved from .push() on the comments db reference, and the user.getUid() is from the firebase auth class.
Now in my Firebase functions I have an "onWrite" trigger for that specific table that gets all of the relevant information and sends out a notification to the poster of the post with all the relevant details. All of that is complete; what I am trying to figure out is how to delete the incoming event so that I don't need any sort of cron job to clear out this table. I can just grab the event, do my needed calculations and data gathering, send the message, then delete the incoming event so it never really exists in the database except for the small amount of time it took to gather the data.
A simplified sample of the firebase functions trigger is...
exports.createNotification = functions.database.ref("/create_notifications/{notification_id}").onWrite(event => {
const from = event.data.val().from;
const toName = event.data.val().to;
const notificationType = event.data.val().type;
const post_id = event.data.val().post_id;
var comment_id, commentReference;
if(notificationType == "comment") {
comment_id = event.data.val().comment_id;
}
const toUser = admin.database().ref(`users`).orderByChild("username").equalTo(toName).once('value');
const fromUser = admin.database().ref(`/users/${from}`).once('value');
const referencePost = admin.database().ref(`posts/${post_id}`).once('value');
return Promise.all([toUser, fromUser, referencePost]).then(results => {
const toUserRef = results[0];
const fromUserRef = results[1];
const postRef = results[2];
var newNotification = {
type: notificationType,
post_id: post_id,
from: from,
sent: false,
create_on: Date.now()
}
if(notificationType == "comment") {
newNotification.comment_id = comment_id;
}
return admin.database().ref(`/user_notifications/${toUserRef.key}`).push().set(newNotification).then(() => {
//NEED TO DELETE THE INCOMING "event" HERE TO KEEP DB CLEAN
});
})
});
So in that function in the final "return" of it, after it writes the finalized data to the "/user_notifications" table, I need to delete the event that started the whole thing. Does anyone know how to do that? Thank you.
First off, use .onCreate instead of .onWrite. You only need to read each child when they are first written, so this will avoid undesirable side effects. See the documentation here for more information on the available triggers.
event.data.ref holds the reference where the event occurred. You can call remove() on that reference to delete it:
return event.data.ref.remove()
The simplest way to achieve this is by calling the remove() function offered by the Admin SDK.
You can get the notification_id through the event, i.e. event.params.notification_id, then remove the node when needed with admin.database().ref('pass in the path').remove(); and you are good to go.
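Putting that together with the path from the question, the call would look something like:
return admin.database()
  .ref(`/create_notifications/${event.params.notification_id}`)
  .remove();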
For newer versions of Firebase, use:
return change.after.ref.remove()
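For completeness, a minimal sketch of the trigger on firebase-functions v1+ using .onCreate, where the handler receives the snapshot directly and the node is removed once the notification work is done:
exports.createNotification = functions.database
  .ref("/create_notifications/{notification_id}")
  .onCreate((snapshot, context) => {
    const notification = snapshot.val();
    // ... look up the users, write to /user_notifications, send the message ...
    // Finally delete the incoming event so the create_notifications "table" stays clean
    return snapshot.ref.remove();
  });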
How do I simultaneously render a page and transmit my custom data to the browser? As I understand it, this requires sending two layers: first the template, and second the JSON data. I want to handle this data with Backbone.
As I understood from tutorials, the Express and Backbone apps interact as follows:
res.render sends a page to the browser
on document.ready, jQuery.get makes a request to app.get('/post')
app.get('/post', post.allPosts) sends data to the page
That is three steps; how can I do it in one?
var visitCard = {
name: 'John Smit',
phone: '+78503569987'
};
exports.index = function(req, res, next){
res.render('index');
res.send({data: visitCard});
};
And how should I catch this variable on the page, e.g. as document.card?
I created my own little middleware function that adds a helper method called renderWithData to the res object.
app.use(function (req, res, next) {
res.renderWithData = function (view, model, data) {
res.render(view, model, function (err, viewString) {
data.view = viewString;
res.json(data);
});
};
next();
});
It takes in the view name, the model for the view, and the custom data you want to send to the browser. It calls res.render but passes in a callback function. This instructs express to pass the compiled view markup to the callback as a string instead of immediately piping it into the response. Once I have the view string I add it onto the data object as data.view. Then I use res.json to send the data object to the browser complete with the compiled view :)
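A hypothetical route using it could look like this (the route, view, and field names are made up):
app.get('/posts/:id', function (req, res) {
  const model = { title: 'My post' };                            // model for the view
  const data = { pageTitle: 'My post', postId: req.params.id };  // extra data for the browser
  res.renderWithData('post', model, data);
  // Responds with JSON like: { pageTitle: 'My post', postId: '42', view: '<compiled markup>' }
});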
Edit:
One caveat with the above is that the request needs to be made with javascript so it can't be a full page request. You need an initial request to pull down the main page which contains the javascript that will make the ajax request.
This is great for situations where you're trying to change the browser URL and title when the user navigates to a new page via AJAX. You can send the new page's partial view back to the browser along with some data for the page title. Then your client-side script can put the partial view where it belongs on the page, update the page title bar, and update the URL if needed as well.
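On the client that could look roughly like this (a sketch assuming jQuery and the JSON shape produced by renderWithData above):
$.get('/posts/42', function (data) {
  $('#content').html(data.view);                        // drop the partial view into the page
  document.title = data.pageTitle;                      // update the title bar
  history.pushState({}, data.pageTitle, '/posts/42');   // update the URL without a full reload
});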
If you are wanting to send a fully complete HTML document to the browser along with some initial JavaScript data then you need to compile that JavaScript code into the view itself. It's definitely possible to do that but I've never found a way that doesn't involve some string magic.
For example:
// controller.js
var someData = { message: 'hi' };
res.render('someView', { data: JSON.stringify(someData) });
// someView.jade
script.
var someData = !{data};
Note: !{data} is used instead of #{data} because jade escapes HTML by default, which would turn all the quotation marks into &quot; placeholders.
It looks REALLY strange at first but it works. Basically you're taking a JS object on the server, turning it into a string, rendering that string into the compiled view and then sending it to the browser. When the document finally reaches the browser it should look like this:
// someSite.com/someView
<script type="text/javascript">
var someData = { "message": "hi" };
</script>
Hopefully that makes sense. If I was to re-create my original helper method to ease the pain of this second scenario then it would look something like this:
app.use(function (req, res, next) {
res.renderWithData = function (view, model, data) {
model.data = JSON.stringify(data);
res.render(view, model);
};
next();
});
All this one does is take your custom data object, stringify it for you, add it to the model for the view, then render the view as normal. Now you can call res.renderWithData('someView', {}, { message: 'hi' }); you just have to make sure that somewhere in your view you grab that data string and render it into a variable assignment statement.
html
head
title Some Page
script.
var data = !{data};
Not gonna lie, this whole thing feels kind of gross but if it saves you an extra trip to the server and that's what you're after then that's how you'll need to do it. Maybe someone can think of something a little more clever but I just don't see how else you'll get data to already be present in a full HTML document that is being rendered for the first time.
Edit2:
Here is a working example: https://c9.io/chevex/test
You need to have a (free) Cloud9 account in order to run the project. Sign in, open app.js, and click the green run button at the top.
My approach is to send a cookie with the information, and then use it from the client.
server.js
const visitCard = {
name: 'John Smit',
phone: '+78503569987'
};
router.get('/route', (req, res) => {
res.cookie('data', JSON.stringify(visitCard));
res.render('index');
});
client.js
const getCookie = (name) => {
const value = "; " + document.cookie;
const parts = value.split("; " + name + "=");
if (parts.length === 2) return parts.pop().split(";").shift();
};
const deleteCookie = (name) => {
document.cookie = name + '=; max-age=0;';
};
const parseObjectFromCookie = (cookie) => {
const decodedCookie = decodeURIComponent(cookie);
return JSON.parse(decodedCookie);
};
window.onload = () => {
let dataCookie = getCookie('data');
deleteCookie('data');
if (dataCookie) {
const data = parseObjectFromCookie(dataCookie);
// work with data. `data` is equal to `visitCard` from the server
} else {
// handle data not found
  }
};
Walkthrough
From the server, you send the cookie before rendering the page, so the cookie is available when the page is loaded.
Then, from the client, you get the cookie with the solution I found here and delete it. The content of the cookie is stored in our constant. If the cookie exists, you parse it as an object and use it. Note that inside the parseObjectFromCookie you first have to decode the content, and then parse the JSON to an object.
Notes:
If you're getting the data asynchronously, be careful to send the cookie before rendering. Otherwise, you will get an error because the res.render() ends the response. If the data fetching takes too long, you may use another solution that doesn't hold the rendering that long. An alternative could be to open a socket from the client and send the information that you were holding in the server. See here for that approach.
Probably data is not the best name for a cookie, as you could overwrite something. Use something more meaningful to your purpose.
I didn't find this solution anywhere else. I don't know if using cookies is not recommended for some reason I'm not aware of. I just thought it could work and checked it did, but I haven't used this in production.
Use res.send instead of res.render. It accepts raw data in any form: a string, an array, a plain old object, etc. If it's an object or array of objects, it will serialize it to JSON for you.
var visitCard = {
name: 'John Smit',
phone: '+78503569987'
};
exports.index = function(req, res, next){
res.send(visitCard);
};
Check out Steamer, a tiny module made for this exact purpose.
https://github.com/rotundasoftware/steamer
The most elegant and simple way of doing this is by using a rendering engine (at least for the page concerned). For example, use the EJS engine:
npm install ejs --save
On server.js:
let ejs = require('ejs');
app.set('view engine', 'ejs');
Then rename the desired index.html page to index.ejs and move it to the /views directory. After that you may make an API endpoint for that page (using the mysql module):
app.get('/index/:id', function(req, res) {
db.query("SELECT * FROM products WHERE id = ?", [req.params.id], (error, results) => {
if (error) throw error;
res.render('index', { title: results[0] });
});
});
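In index.ejs you can then read the passed object, for example (a sketch; the column names are only assumptions about the products table):
<h1><%= title.name %></h1>
<script>
  // <%- %> avoids HTML-escaping so the row comes through as valid JSON
  var product = <%- JSON.stringify(title) %>;
  console.log(product.id);
</script>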
On the front end you will need to make a GET request, for example with Axios, or directly by clicking a link in the index.ejs template page that sends the request:
<a v-bind:href="'/index/' + co.id">Click</a>
where co.id is the id of the Vue data object 'co' that you want to send along with the request.