How to set a key in Redis and get the value (I'm building a url shortener) - javascript

I'm fairly new to Redis and my project is at a stand-still because I don't know any other way to set and get values in Redis.
My problem is that I'm building a URL shortener: when the user posts a URL to the server (a POST request), I set the URL as the key and a nanoid-generated code as the value, and send the nanoid code back to the user. But when the user sends a GET request with the URL code, I need to check whether the URL is already cached and redirect the user to it. I can't, because the actual URL has been set as the key, not the URL code, so the lookup always returns undefined. Can you help me with this problem? Is there some other way to do this? Many thanks in advance! Here is the code:
import redis from 'redis';
import http from 'http';
import express from 'express';
import { Router, Request, Response, NextFunction } from 'express';
import { promisify } from 'util';
import { nanoid } from 'nanoid';

interface Handler {
  (req: Request, res: Response, next: NextFunction): Promise<void> | void;
}

interface Route {
  path: string;
  method: string;
  handler: Handler | Handler[];
}

const { PORT = 8080 } = process.env;
// I'm using a docker container
const { REDIS_URL = 'redis://cache:6379' } = process.env;

const redisClient = redis.createClient({
  url: REDIS_URL
});

const initCache = async () =>
  new Promise((resolve, reject) => {
    redisClient.on('connect', () => {
      console.log('Redis client connected');
      resolve(redisClient);
    });
    redisClient.on('error', error => reject(error));
  });

async function getShortenedURL(url: string) {
  const urlCode = nanoid(7);
  redisClient.setex(url, 3600, urlCode);
  return urlCode;
}

const getAsync = promisify(redisClient.get).bind(redisClient);

async function getFromCache(key: string) {
  const data = await getAsync(key);
  return data;
}

const routes = [
  {
    path: '/:url',
    method: 'get',
    handler: [
      async ({ params }: Request, res: Response, next: NextFunction) => {
        try {
          const { url } = params;
          const result = await getFromCache(url);
          if (result) {
            res.redirect(301, result);
          } else {
            throw new Error('Invalid url');
          }
        } catch (error) {
          console.error(error);
        }
      }
    ]
  },
  {
    path: '/api/url',
    method: 'post',
    handler: [
      async ({ body }: Request, res: Response, next: NextFunction) => {
        const { url } = body;
        const result = await getFromCache(url);
        result ? res.status(200).send(`http://localhost:${PORT}/${result}`) : next();
      },
      async ({ body }: Request, res: Response) => {
        const result = await getShortenedURL(body.url as string);
        res.status(200).send(result);
      }
    ]
  }
];

const applyRoutes = (routes: Route[], router: Router) => {
  for (const route of routes) {
    const { method, path, handler } = route;
    (router as any)[method](path, handler);
  }
};

const router = express();
applyRoutes(routes, router);
const server = http.createServer(router);

async function start() {
  await initCache();
  server.listen(PORT, () => {
    console.log(`Server is running on http://localhost:${PORT}...`);
  });
}

start();

As I understand it, you need to make sure that you do not shorten and store any given url twice.
You could encode the url and use it as the short version and as the key at the same time. E.g.
www.someurltoshorten.com -> encoded value ->
{key: value} -> encoded value: www.someurltoshorten.com
If a user wants to shorten a url, you encode it first, and you should get the exact same hash for the exact same url.
Once you have the encoded value, you can use the SET command with the "GET" option. You can also use the expire (EXAT) option to clean up old urls (those that nobody is looking for anymore) using the expiration feature built into Redis.
It will do the following for you:
Set the key to hold the string value (the key is the short version of the url and the value is the url itself).
If the value exists, it will overwrite it and reset (extend) the TTL (time to live), if you set one.
And the "GET" option will return the old value if it exists, or null.
With one command you will be able to:
Create a value in Redis
Get the value if it already exists, resetting the TTL (it makes sense to extend it), all of that without any extra code, with a single command!
The flow may look as follows:
A user inputs a url to be shortened:
you encode the url
you store it in Redis using the SET command, where the key is the encoded value and the value is the url.
you return the encoded value, which you already know. There is no need to check whether the url has already been shortened, because the SET command will either create a new entry or update the existing one.
A user inputs a shortened url:
you encode the url
you store it in Redis using the SET command, where the key is the encoded value and the value is the url.
you get the url from the value that was returned by the SET command, thanks to the "GET" option.
The only difference between the two cases is whether you return the shortened url or the original url.
Basically, you need one Redis command for all of that to work.
I did not test the encoding/hashing of the url and it may not work with all types of urls. You need to check which encoding would cover all cases.
But the idea here is the concept itself. It's similar to how we handle passwords. When you register, the password is hashed. Then, when you log in and provide the same password, we can hash it again and compare hashes. Secure hashing with bcrypt, as an example, can be expensive (it can take a lot of time).
For urls you need to make sure that the encoding/hashing always produces the same result for the same url.
Keep in mind the length of the keys, as described here: https://redis.io/topics/data-types-intro#redis-keys
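As a rough illustration of that single-command idea (not tested against the asker's setup), here is a minimal sketch assuming node-redis v4, whose set() accepts EX and GET options; storeUrl and shortCode are hypothetical names, and shortCode is the deterministic encoding of the url discussed above:
import { createClient } from 'redis';

const client = createClient({ url: process.env.REDIS_URL });
// call `await client.connect()` once at startup before using the client

// Hypothetical helper: one round trip that creates or refreshes the entry,
// sets a one-hour TTL, and returns the previous value (or null) via the GET option.
async function storeUrl(shortCode: string, url: string) {
  const previous = await client.set(shortCode, url, { EX: 3600, GET: true });
  return previous ?? url;
}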

You should use the hash code generated for the URL as the key for your dictionary, since you intend to look up by the shortened URL later.
POST -> hash the URL, encode it as needed for your length restrictions, return the shortened key as the shortened URL, and put <hash, URL> in your map.
GET -> the user gives the shortened key; look up that key in the dictionary and return the actual URL.
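For example, one way to derive a deterministic short key from a URL is to hash it with Node's crypto module and keep the first few characters. This is a sketch only; the truncation length is arbitrary and, with codes this short, collisions are possible, so lengthen the code or re-check on a clash:
import { createHash } from 'crypto';

// Hypothetical helper: the same URL always yields the same 7-character code.
function shortCodeFor(url: string): string {
  return createHash('sha256').update(url).digest('hex').slice(0, 7);
}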


Stripe Payment Success URL Redirect With Parameter

I'm working with the following Stripe.js file in a Next.js project:
import { loadStripe } from "@stripe/stripe-js";

export async function Stripe({ lineItems }, imageUrls) {
  let stripePromise = null;
  const getStripe = () => {
    if (!stripePromise) {
      stripePromise = loadStripe(process.env.NEXT_PUBLIC_API_KEY);
    }
    return stripePromise;
  };
  const stripe = await getStripe();
  await stripe.redirectToCheckout({
    mode: "payment",
    lineItems,
    successUrl: `http://localhost:3000/success?pdf=${imageUrls}`,
    cancelUrl: window.location.origin,
  });
}
When I call the Stripe function, I'm passing an imageUrls array which looks like this for example:
['blob:http://localhost:3000/2a47a926-be04-49a9-ad96-3279c540ebb4']
When the Stripe redirectToCheckout happens, I navigate to the success page and pass imageUrls.
My goal is to convert these imageUrls into png images from the success page using code like this inside of an async function:
const fileResponse = await fetch(imageUrl);
const contentType = fileResponse.headers.get("content-type");
const blob = await fileResponse.blob();
const imageFile = new File([blob], `someImage.png`, {
  contentType,
});
I end up getting this error though:
GET blob:http://localhost:3000/2a47a926-be04-49a9-ad96-3279c540ebb4 net::ERR_FILE_NOT_FOUND
I'm guessing after the redirect this URL doesn't exist anymore? What is the correct way to make something like this work?
Edit to include success.js code:
import Layout from "../components/Layout/Layout";
import { useEffect } from "react";

function SuccessPage() {
  useEffect(() => {
    const params = new Proxy(new URLSearchParams(window.location.search), {
      get: (searchParams, prop) => searchParams.get(prop),
    });
    let value = params.pdf;
    console.log("this is value");
    console.log(value);

    async function getFileFromUrl(imageUrl) {
      const fileResponse = await fetch(imageUrl);
      const contentType = fileResponse.headers.get("content-type");
      const blob = await fileResponse.blob();
      const ditheredImageFile = new File([blob], `test.png`, {
        contentType,
      });
      return ditheredImageFile;
    }

    let imageFile = getFileFromUrl(value);
    console.log("this is imageFile");
    console.log(imageFile);
  }, []);

  return (
    <Layout>
      <h3>Thank You For Your Order!</h3>
    </Layout>
  );
}

export default SuccessPage;
You're using blobs and you shouldn't be; I'm guessing those blobs came from user input.
Blobs in JavaScript work in-memory (RAM); they are discarded when your document unloads.
Since you're redirecting the user to another page (Stripe), you're unloading your document and thus losing everything you have in memory. All your blobs are gone: they are only good while your document is loaded, and after you leave it or get redirected they are cleared from memory.
To solve your problem you simply must upload the files to a server before unloading the document (redirecting the user to Stripe), pass your server URL instead of your "internal" (blob) URL, and everything should work.
Basically, you need to save your files on a server via AJAX, have the server save the files and return their URLs (or, even better, an ID for your image collection), and use those server URLs in the redirect (or an ID that you'll use to retrieve all the files you need later, simplifying your parameter usage).
More info at: https://javascript.info/blob ("Blob as URL" section)
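For illustration only, a rough sketch of that flow; the /api/images endpoint, the returned ids field, and uploadBlobsAndCheckout are all hypothetical and need to be adapted to your own backend:
async function uploadBlobsAndCheckout(stripe, lineItems, imageUrls) {
  // Fetch each in-memory blob URL back into a Blob while the document is still loaded.
  const form = new FormData();
  for (const [i, url] of imageUrls.entries()) {
    const blob = await (await fetch(url)).blob();
    form.append("images", blob, `image-${i}.png`);
  }

  // Hypothetical endpoint that stores the files and returns server-side ids.
  const res = await fetch("/api/images", { method: "POST", body: form });
  const { ids } = await res.json();

  // Only now redirect; the success page can re-fetch the images by id.
  await stripe.redirectToCheckout({
    mode: "payment",
    lineItems,
    successUrl: `${window.location.origin}/success?images=${ids.join(",")}`,
    cancelUrl: window.location.origin,
  });
}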
The localhost url will not work because Stripe has no way of accessing an image on your local machine. Instead, you should upload the image to a database and provide a public URL. I'm not sure what systems (e.g. AWS, Azure) you use, so it's hard to get very specific.
The database can then be queried after landing on the checkout page. One way to do this would be to pass the ID of the item as a URL param. Another way is to store information about the product in local storage.
Either way, you should get a response from your database with a unique link or ID so you know exactly what to query later.
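A small sketch of the ID-based variant; orderId, /api/orders, and the response shape are hypothetical:
// Before redirecting, include the stored record's id in the success URL, e.g.
// successUrl: `${window.location.origin}/success?orderId=${orderId}`

// On the success page, read the id and ask your own backend for the public image URLs.
async function loadOrderImages(): Promise<string[]> {
  const orderId = new URLSearchParams(window.location.search).get("orderId");
  const res = await fetch(`/api/orders/${orderId}`); // hypothetical endpoint
  const { imageUrls } = await res.json();            // public URLs stored server-side
  return imageUrls;
}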
You should pass the image data as a base64-encoded string to the success URL and then decode it on the success page to get the actual image data.
Modify your code as follows.
In the Stripe function:
const imageUrls = lineItems.map((item) => {
  return btoa(item.url);
});
// ...
successUrl: `http://localhost:3000/success?pdf=${imageUrls}`,
// ...
On the SuccessPage:
async function getFileFromUrl(imageUrl) {
  const decodedUrl = atob(imageUrl);
  const fileResponse = await fetch(decodedUrl);
  const contentType = fileResponse.headers.get("content-type");
  const blob = await fileResponse.blob();
  const ditheredImageFile = new File([blob], `test.png`, {
    contentType,
  });
  return ditheredImageFile;
}
I think the error is coming from this:
a blob URL is only available in the same origin context, but your successUrl is not in the same context as the blob.
Please try to generate data URLs from the blobs.
const dataUrls = await Promise.all(
  imageUrls.map(async (url) => {
    const response = await fetch(url);
    const blob = await response.blob();
    return URL.createObjectURL(blob);
  })
);

await stripe.redirectToCheckout({
  mode: "payment",
  lineItems,
  successUrl: `http://localhost:3000/success?pdf=${encodeURIComponent(dataUrls.join(","))}`,
  cancelUrl: window.location.origin,
});
Then decode the urls on the success page, and you should be able to resolve your issue.
The correct way to make this work is to use a URL query parameter instead of a URL path parameter. When you pass parameters in the URL path, they are not accessible after the redirect.
So instead, you should pass the imageUrls array as a query parameter in the successUrl like this:
successUrl: `http://localhost:3000/success?pdf=${imageUrls.join(',')}`,
You can then access the query parameter in the success page and convert the imageUrls array into png images.

agora start method error : post method api body check failed

I'm building a video-calling app using Next.js and agora.io 4, and I followed the steps mentioned in the docs.
I enabled Agora cloud recording,
called the acquire method and got the resourceId.
Then I called the start method, but it always fails with the error "post method API body check failed"!
However, it works perfectly in Postman.
Here's the code:
import axios from "axios";
import chalk from "chalk";

// AWS S3 storage bucket credentials
const secretKey = process.env.S3_SECRET_KEY;
const accessKey = process.env.S3_ACCESS_KEY;
const bucket = process.env.S3_BUCKET_NAME;
const region = process.env.S3_BUCKET_REGION;
const vendor = process.env.S3_VENDOR;

// agora credentials
const appId = process.env.APP_ID;
const key = process.env.KEY;
const secret = process.env.SECRET;

export default async function startHandler(req, res) {
  // call agora start method
  const { uid, cname, resourceId, token } = req.body;
  const plainCredential = `${key}:${secret}`;
  const encodedCredential = Buffer.from(plainCredential).toString("base64"); // Encode with base64
  const authorizationField = `Basic ${encodedCredential}`;
  const data = {
    uid,
    cname,
    clientRequest: {
      recordingConfig: {
        streamMode: "standard",
        channelType: 0,
        subscribeUidGroup: 0,
      },
      storageConfig: {
        accessKey,
        region,
        bucket,
        secretKey,
        vendor,
      },
    },
  };
  const headers = {
    "Content-Type": "application/json",
    Authorization: authorizationField,
  };
  const startUrl = `https://api.agora.io/v1/apps/${appId}/cloud_recording/resourceid/${resourceId}/mode/individual/start`;
  try {
    const response = await axios.post(startUrl, data, {
      headers,
    });
    res.status(200).send(response.data);
  } catch (error) {
    console.error(error);
    res.send(error);
  }
}
Any help/hint would be much appreciated
I found the fix!
First, you may be tricked by the uid returned from the Agora join method: it returns a Number, surprisingly, while the start method
expects the uid to be a string, so don't forget to call
uid.toString().
Second, in the storageConfig object you should check the type of each attribute: region and vendor are each expected to be of type Number. If you're storing this info in a .env file, remember that environment files only store strings, so you have to convert them to Numbers!
This problem took me two days, so I hope this will be useful for you!
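In terms of the asker's handler above, the fix would roughly look like this (a sketch only; field names match the handler, and whether your .env values actually need converting depends on your setup):
// Convert the env strings to the numeric types the start endpoint expects.
const region = Number(process.env.S3_BUCKET_REGION);
const vendor = Number(process.env.S3_VENDOR);

// ...and inside startHandler, make sure uid is sent as a string:
const data = {
  uid: uid.toString(),
  cname,
  clientRequest: {
    recordingConfig: { streamMode: "standard", channelType: 0, subscribeUidGroup: 0 },
    storageConfig: { accessKey, region, bucket, secretKey, vendor },
  },
};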

uploading multiple FormField objects containing image data arrays from angular to express

I am trying to upload two FormData objects along with form data to express.
The part I'm stuck at is using the multer library in express to extract this data from the request. I can access the form data but not the FormData objects.
in angular:
requestBooking(formFields, aPics: Array<File>, bPics: Array<File>) {
  const aaPics = new FormData();
  const bbPics = new FormData();
  aPics.forEach(image => {
    aaPics.append('aImgs', image);
  });
  bPics.forEach(image => {
    bbPics.append('bImgs', image);
  });
  // aaPics.forEach((value, key) => {
  //   console.log(key + ' ' + value);
  // })
  const payload = {
    form: formFields,
    aImgs: aaPics,
    bImgs: bbPics
  };
  this.rApi.makePostRequest(
    this.serverUrl + this.uris.requestBooking,
    payload
  ).subscribe(
    res => {
      let response: apiResponse = {
        type: 'Post',
        origin: this.apiOrigins.requestBooking,
        isError: false,
        content: res
      };
      this.bookingUpdateResponse$.next(response);
    }, err => {
      this.bookingUpdateResponse$.next(err);
    }
  );
}
I have confirmed that the data is correctly appended to the FormData objects, so I think it's getting sent.
in express:
routes/booking.js
const aUploads = multer();
const bUploads = multer();
const bookingRouter = express.Router();
bookingRouter.post('/request', aUploads.array('aImgs', 10), bUploads.array('bImgs', 10), requestABooking);
controllers/bookings.js
export const requestABooking = async (req, res) => {
  const PATH = './uploads';
  const bookId = uuidv4();
  const guestInfo = req.body.form.fmgroup1;
  const tattooInfo = req.body.form.fmgroup2;
  const bodyImgs = req.body.form.aImgs;
  const tatImgs = req.body.form.bImgs;
  // console.log(req.body);
  // bodyImgs.forEach((value, key) => {
  //   console.log(key + ' ' + value);
  // })
}
I am not able to see the FormData information at this point.
I am pretty sure I'm using multer wrong in the routes, but this is not the first thing I've tried. Ideally I'd rather not add to the route this way, but instead extract the info from the object in the body, as I think the former would require me to write a specific path for the upload on the Angular side.
If there is a way to do this within the express controller, that would be the best solution I think, but if not, any solution would be very welcome!
You can refer to this post:
https://javascript.plainenglish.io/uploading-files-using-multer-on-server-in-nodejs-and-expressjs-5f4e621ccc67
I just followed it yesterday to make it work.
It seems your multer is not configured the way it should be.
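As a rough idea of what a basic multer configuration looks like (a sketch only, not taken from the linked article; the destination path and field names are placeholders reusing the asker's route and handler names):
const multer = require('multer');

// Store uploads on disk; multer.memoryStorage() is another option if you
// want the file buffers available on req.files instead.
const storage = multer.diskStorage({
  destination: (req, file, cb) => cb(null, './uploads'),
  filename: (req, file, cb) => cb(null, `${Date.now()}-${file.originalname}`),
});
const upload = multer({ storage });

// Accept both file fields on the same route.
bookingRouter.post(
  '/request',
  upload.fields([{ name: 'aImgs', maxCount: 10 }, { name: 'bImgs', maxCount: 10 }]),
  requestABooking
);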
BLUF: add all of your information to a single FormData object.
After struggling with this for days I finally found a satisfactory answer!
The problem: I wanted to pass info like this:
parentObject: {
  formInfo: {json},
  imageArr1: {Array<File>},
  imageArr2: {Array<File>}
}
But to handle multipart form information I needed to use something like multer to deal with the data buffers and other large-file streaming content involved.
To use multer you configure it at the router level in the backend, which I initially did not want to do, as it would make my process look something like this:
Client uploads form data -> server posts the data and returns a key for the object created -> client uploads image data with the key for management -> server returns a good response and all is well. But if that ever failed, there are a ton of error-handling processes that would need to be put in place, so I really didn't want to break this up into a multiple-service-call scenario.
But the simple solution looks like this:
Client
const formData = new FormData();
Object.keys(formFields).forEach(field => { formData.append(field, formFields[field]); });
fileArrAA.forEach(image => { formData.append('aaImages', image); });
fileArrBB.forEach(image => { formData.append('bbImages', image); });
this.apiSvc.post(url, formData).subscribe(...);
Server
const multer = require('multer');
const upload = multer();

const endpoint = async (req, res) => {
  req.files; // has all the File data in it as an array
  req.body;  // has all the form fields or any key/value pairs in it
};

router.post('/postThing', upload.any(), endpoint);
Observation: I tried passing a nested object into the formData object, but I wasn't able to extract it in the backend, so I ended up passing all the fields in the top layer of the formData object. I'm probably missing a package for parsing the API data or something.
Hope no one else has to go through this learning curve!

Cloudinary Signed Uploads with Widget

Documentation is extremely frustrating.
I'm using the upload widget to try to allow users to upload multiple pictures for their profile. I can't use unsigned uploads because of the potential for abuse.
I would much rather upload the file through the upload widget instead of through the server, as it seems like it should be so simple.
I've pieced together what I think should work, but it is still saying: "Upload preset must be whitelisted for unsigned uploads".
Server:
// grab a current UNIX timestamp
const millisecondsToSeconds = 1000;
const timestamp = Math.round(Date.now() / millisecondsToSeconds);

// generate the signature using the current timestamp and any other desired Cloudinary params
const signature = cloudinaryV2.utils.api_sign_request({ timestamp }, CLOUDINARY_SECRET_KEY);

// craft a signature payload to send to the client (timestamp and signature required)
return signature;
also tried
return {
  signature,
  timestamp,
};
also tried
const signature = cloudinaryV2.utils.api_sign_request(
  data.params_to_sign,
  CLOUDINARY_SECRET_KEY,
);
Client:
const generateSignature = async (callback: Function, params_to_sign: object): Promise<void> => {
  try {
    const signature = await generateSignatureCF({ slug: 'xxxx' });
    // also tried { slug: 'xxxx', params_to_sign }
    callback(signature);
  } catch (err) {
    console.log(err);
  }
};

cloudinary.openUploadWidget(
  {
    cloudName: 'xxx',
    uploadPreset: 'xxxx',
    sources: ['local', 'url', 'facebook', 'dropbox', 'google_photos'],
    folder: 'xxxx',
    apiKey: ENV.CLOUDINARY_PUBLIC_KEY,
    uploadSignature: generateSignature,
  },
  function(error, result) {
    console.log(error);
  },
);
Let's all take a moment to point out how horrible Cloudinary's documentation is. It's easily the worst I've ever seen. Nightmare fuel.
Now that I've got that off my chest... I really needed to be able to do this, and I spent way too long banging my head against walls for what should be extremely simple. Here it is...
Server (Node.js)
You'll need an endpoint that returns a signature-timestamp pair to the frontend:
import cloudinary from 'cloudinary'

export async function createImageUpload() {
  const timestamp = new Date().getTime()
  const signature = await cloudinary.utils.api_sign_request(
    {
      timestamp,
    },
    process.env.CLOUDINARY_SECRET
  )
  return { timestamp, signature }
}
Client (Browser)
The client makes a request to the server for a signature-timestamp pair and then uses that to upload a file. The file used in the example should come from an <input type='file'/> change event etc.
const CLOUD_NAME = process.env.CLOUDINARY_CLOUD_NAME
const API_KEY = process.env.CLOUDINARY_API_KEY

async function uploadImage(file) {
  const { signature, timestamp } = await api.post('/image-upload')
  const form = new FormData()
  form.append('file', file)
  const res = await fetch(
    `https://api.cloudinary.com/v1_1/${CLOUD_NAME}/image/upload?api_key=${API_KEY}&timestamp=${timestamp}&signature=${signature}`,
    {
      method: 'POST',
      body: form,
    }
  )
  const data = await res.json()
  return data.secure_url
}
That's it. That's all it takes. If only Cloudinary had this in their docs.
Man, I hate my life. I finally figured it out. It literally took me beautifying the upload widget JS to understand that the return value of the function should be a string instead of an object, even though the docs make it seem otherwise.
Here is how to implement a signed upload with a Firebase Cloud Function
import * as functions from 'firebase-functions';
import cloudinary from 'cloudinary';

const CLOUDINARY_SECRET_KEY = functions.config().cloudinary.key;
const cloudinaryV2 = cloudinary.v2;

module.exports.main = functions.https.onCall(async (data, context: functions.https.CallableContext) => {
  // Checking that the user is authenticated.
  if (!context.auth) {
    // Throwing an HttpsError so that the client gets the error details.
    throw new functions.https.HttpsError(
      'failed-precondition',
      'The function must be called while authenticated.',
    );
  }
  try {
    return cloudinaryV2.utils.api_sign_request(data.params_to_sign, CLOUDINARY_SECRET_KEY);
  } catch (error) {
    throw new functions.https.HttpsError('failed-precondition', error.message);
  }
});

// CLIENT
const uploadWidget = () => {
  const generateSignature = async (callback: Function, params_to_sign: object): Promise<void> => {
    try {
      const signature = await generateImageUploadSignatureCF({ params_to_sign });
      callback(signature.data);
    } catch (err) {
      console.log(err);
    }
  };

  cloudinary.openUploadWidget(
    {
      cloudName: 'xxxxxx',
      uploadSignature: generateSignature,
      apiKey: ENV.CLOUDINARY_PUBLIC_KEY,
    },
    function(error, result) {
      console.log(error);
    },
  );
};

How to handle authorization token

I would like to add an auth token to the HTTP request header every time a request is sent, and if authorization fails, I want to redirect the user to the login page. Should I decorate the HTTP driver, or is there a better way to do it?
I came up with a solution that decorates the HTTP driver, but I'm not sure this is the correct way of doing it. Here's the code I have written so far:
import Rx from 'rx';
import { makeHTTPDriver } from '@cycle/http';

function makeSecureHTTPDriver({ eager = false } = { eager: false }) {
  return function secureHTTPDriver(request$) {
    const httpDriver = makeHTTPDriver(eager);
    const securedRequest$ = request$
      .map(request => {
        const token = localStorage.getItem('token');
        if (token) {
          request.headers = request.headers || {};
          request.headers['X-AUTH-TOKEN'] = token;
        }
        return request;
      });
    const response$ = httpDriver(securedRequest$);
    // todo: check response and if it fails, redirect to the login page
    return response$;
  };
}

export default makeSecureHTTPDriver;
Here is how I use makeSecureHTTPDriver:
const drivers = {
  DOM: makeDOMDriver('#app'),
  HTTP: makeSecureHTTPDriver()
};
This is a little late; I don't frequent SO very much. I'd suggest using other drivers instead, to avoid placing any logic in your drivers.
import storageDriver from '@cycle/storage'
import { makeHTTPDriver } from '@cycle/http'

function main(sources) {
  const { storage, HTTP } = sources
  const token$ = storage.local.getItem('token')
    .startWith(null)
  const request$ = createRequest$(sources)
  const secureRequest$ = request$.withLatestFrom(token$,
    (request, token) => token ?
      Object.assign(request, { headers: { 'X-AUTH-HEADER': token } }) :
      request
  )
  return { HTTP: secureRequest$, ... }
}

Cycle.run(main, {
  ...
  storage: storageDriver,
  HTTP: makeHTTPDriver()
})
I'm not sure if this will help, but the HTTP driver is superagent under the hood, so you can pass it an object with the required info, as shown here.
But in regards to your issue, I think the HTTP driver might need this option added to the driver itself, so you can dictate whether the driver should be secure or not, e.g.:
const drivers = {
  DOM: makeDOMDriver('#app'),
  HTTP: makeSecureHTTPDriver({ secure: true })
};
Because your implementation looks OK to me, it might be worth having it in the driver itself.
I'd create an issue in the HTTP driver repo and see what the community thinks; you can also ask people to interact via the gitter channel :-)
