I'm using Next.js 13 with Firebase v9, and I'm using a dropzone to upload images. The dropzone returns an array with a blob URL as each item's src:
[
{
id: 1
name: "image_processing20220628-4591-yzir35.png"
src: "blob:http://localhost:3000/6e2f33e5-a749-4e9a-b502-d20b8e3f38ca"
}
...
]
The above array is returned from the dropzone, and when I try to upload to Firebase Storage it throws an error:
FirebaseError: Firebase Storage: Object 'livingImages/blob:http:/localhost:3000/ca0e3eaf-dbe9-4d77-8053-f4b6d1bd8600' does not exist. (storage/object-not-found)
So how can I upload the image blobs to Firebase Storage? Here is my code:
const imgURL = [];
//this is the images stored inside Redux
const images = useSelector(selectImages);
const storage = getStorage();
images.map(async (file) => {
const storageRef = ref(storage, `livingImages/${file.src}`);
await getDownloadURL(storageRef).then((url) => {
imgURL.push(url);
});
});
const createDocument = () => {
const docRef = doc(db, "livingPosts", session?.user?.email);
const colRef = collection(docRef, "posts");
addDoc(colRef, {
name: "test upload",
images: imgURL,
});
};
The dropzone code:
const dispatch = useDispatch();
const images = useSelector(selectImages);
const [files, setFiles] = useState(images == [] ? [] : images);
const {getRootProps, getInputProps} = useDropzone({
onDrop: (acceptedFiles) => {
acceptedFiles.map((file, index) => {
const reader = new FileReader();
reader.onload = async function (e) {
const options = {
maxSizeMB: 5,
maxWidthOrHeight: 1920,
useWebWorker: true,
};
const compressedFile = await imageCompression(file, options);
const tot = parseInt(acceptedFiles.length) + parseInt(files.length);
if (tot > 9) {
alert("select maximum of 9 images");
} else if (parseInt(acceptedFiles.length) > 9) {
alert("maximum images to be selected is 9");
} else if (parseInt(files.length) < 9) {
setFiles((prevState) => [
...prevState,
{
id: index,
src: URL.createObjectURL(compressedFile),
name: file.name,
},
]);
files.map((filename) => {
acceptedFiles.forEach((newFile) => {
if (newFile.name == filename.name) {
alert("a duplicate image is detected");
setFiles(
files,
files.filter((val) => val !== newFile)
);
}
});
});
} else {
alert("something went wrong");
}
};
reader.readAsDataURL(file);
return file;
});
},
})
The output of the dropzone is the array shown at the top of the question.
As mentioned in the comments, you'll need the actual File or Blob object to upload the file and not the object URL. You can set the blob in state as shown below:
setFiles((prevState) => [
...prevState,
{
id: index,
src: URL.createObjectURL(compressedFile),
blob: compressedFile, // <-- add blob
name: file.name,
},
]);
Then, to upload the files and store the download URLs in a Firestore document, try the following function:
import { ref, uploadBytes, getDownloadURL } from "firebase/storage";
import { addDoc, collection } from "firebase/firestore";
const uploadFiles = async () => {
console.log(files);
const promises = files.map((file) => {
const storageRef = ref(storage, `images/${file.name}`);
return uploadBytes(storageRef, file.blob);
});
// upload all files
const res = await Promise.all(promises);
// get download URLs
const links = await Promise.all(res.map((r) => getDownloadURL(r.ref)));
console.log({ links })
// Add Firestore document
const colRef = collection(db, "livingPosts", session?.user?.email, "posts")
const docRef = await addDoc(colRef, {
name: "test",
images: links,
});
console.log("Document written with ID: ", docRef.id);
};
You can call this function on a submit button click or any event when you want to start the upload.
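For example, you can wire it up to a submit button (a minimal sketch; the markup and label are just illustrative):
<button type="button" onClick={uploadFiles}>
  Upload images
</button>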
This is my cache "component":
// imports
const useCache = (cacheName: string, url: string) => {
const cacheArray: Array<Object> = []
const getAllCaches = async () => {
const cacheNames = await caches.keys();
for (const cname of cacheNames) {
const cacheStorage = await caches.open(cname);
const cachedResponse = await cacheStorage.match(url);
const cdata = await cachedResponse?.json()
cacheArray.push({name: cname, data: cdata})
}
}
useEffect(() => {
getAllCaches()
.catch(err => console.log(err))
}, [])
const addCache = (response: any) => {
const data = new Response(JSON.stringify(response));
if ('caches' in window) {
caches.open(cacheName).then((cache) => {
cache.put(url, data);
});
}
const finalData = {name: cacheName, data: response}
cacheArray.push(finalData)
return data
}
const getCache = (cacheName?: string) => {
if (cacheName) {
return cacheArray.filter((i: any) => i.name === cacheName)[0]
}
else {
return cacheArray
}
}
const removeCache = (cacheName: string) => {
caches.delete(cacheName).then(function (res) {
return res;
});
}
return [
getCache as (cacheName?: any) => any,
addCache as (response: any) => any,
removeCache as (cacheName: any) => any
]
};
export default useCache;
Now here's code in my home component:
const [getCache, addCache, removeCache] = useCache("user", "http://localhost:3000")
useEffect(() => {
console.log(getCache())
console.log(getCache()[0])
console.log(getCache().length)
// the rest of the code doesn't matter
When I run the home component (with Vite and Preact) it logs an Array, then undefined, then 0 (but the second should return an object, and the third should return 1). I also attached a screenshot from the console.
Why is it returning undefined and a length of 0 when it should return an object and 1?
I'm using Preact, Vite, the newest Node.js, and TypeScript.
I'm trying to add an image upload feature to the react-draft-wysiwyg editor.
As per the editor documentation,
image: uploadCallback: This is image upload callBack. It should return a promise that resolves to give image src. Default value is true.
Both above options of uploadEnabled and uploadCallback should be present for upload to be enabled.
Promise should resolve to return an object { data: { link: <THE_URL>}}.
Source: https://jpuri.github.io/react-draft-wysiwyg/#/docs
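In other words, a minimal callback satisfying that contract would look something like this sketch, where uploadSomewhere is a hypothetical helper that uploads the file and resolves with its public URL:
const uploadCallback = async (file) => {
  // Assumption: uploadSomewhere is your own upload logic that returns the file's URL.
  const url = await uploadSomewhere(file);
  return { data: { link: url } };
};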
The problem I'm facing: I upload the image to Firebase, get the resulting URL, and try to use it to build the object that the callback should return.
In the code below, the image is first uploaded with Firebase's uploadBytes method, then getDownloadURL is used to get the file's URL. The bug is that an object with an undefined url link is returned from the callback.
const uploadImageCallBack = (file) => {
let linkImg = "";
let arr = [];
const saveImagePromise = new Promise((resolve, reject) => {
const fileNameParts = file.name.split(".");
const storage = getStorage(app);
const storageRef = ref(
storage,
"Random/" +
title +
"/editorImage." +
uuidv4() +
fileNameParts[fileNameParts.length - 1]
);
debugger;
const metadata = {
contentType: file.type,
};
try {
const uploadTask = uploadBytes(storageRef, file, metadata);
debugger;
uploadTask.then((snapshot) => {
debugger;
const downloadURLPromise = getDownloadURL(storageRef);
downloadURLPromise.then((url) => {
linkImg = url;
debugger;
});
arr.push(downloadURLPromise);
});
arr.push(uploadTask);
} catch (error) {
console.log(error);
reject(error);
}
});
arr.push(uploadBytes, saveImagePromise);
console.log(Infinity);
Promise.all(arr).then((res) => {
console.log(res);
console.log(Infinity);
return new Promise((resolve, reject) => {
resolve({ data: { link: linkImg } });
});
});
};
And the code for the editor is:
<Editor
toolbar={{
inline: { inDropdown: true },
list: { inDropdown: true },
textAlign: { inDropdown: true },
link: { inDropdown: true },
history: { inDropdown: true },
image: {
urlEnabled: true,
uploadEnabled: true,
alignmentEnabled: true,
uploadCallback: uploadImageCallBack,
previewImage: true,
inputAccept:
"image/gif,image/jpeg,image/jpg,image/png,image/svg",
alt: { present: true, mandatory: false },
defaultSize: {
height: "auto",
width: "auto",
},
},
}}
onContentStateChange={(data) => {
let res = convertToPlain(draftToHtml(data));
console.log(data);
setReasonProgress(
remainigchar >= 100 ? 100 : remainigchar
);
}}
wrapperClassName="wrapper-class"
editorClassName="editor-class"
toolbarClassName="toolbar-class"
/>
Please help me to create a correct return statement.
I solved this problem by using an async function and adding an await statement for the URL.
const uploadImageCallBack = async (file) => {
const fileNameParts = file.name.split(".");
const storage = getStorage(app);
const storageRef = ref(
storage,
uuidv4() +
fileNameParts[fileNameParts.length - 1]
);
let imageObject = {
file: file,
localSrc: URL.createObjectURL(file),
};
const metadata = {
contentType: file.type,
};
const snapshot = await uploadBytes(storageRef, file, metadata);
const url = await getDownloadURL(storageRef);
console.log(url, snapshot);
return new Promise((resolve, reject) => {
resolve({ data: { link: url } });
});
};
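Note that since the callback is already an async function, wrapping the result in new Promise isn't strictly necessary; returning the object directly should behave the same way:
return { data: { link: url } };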
I am working on a Gatsby site that uses gatsby-source-wordpress. In my gatsby-node.js file, I use the onCreateNode lifecycle method to determine if the node is a certain WordPress custom post type, then I reach out to a separate API to get related information for the post type, use createNodeField to add it as a field, and sometimes also use createRemoteFileNode to add images sourced from the API to a field on the new node.
Now this works great most of the time, but occasionally the createPages lifecycle method runs while the image/node code is still happening (I believe). This means that the image fields don't exist yet, and the page creation fails. Then after it fails, I see a console message in the log that I set up where it notifies me that the new field has successfully been added to the node.
How can I make sure that all of those nodes are finished and the data is complete, BEFORE the createPages lifecycle runs? It seems when the client uploads a larger image, this is more likely to fail... which makes sense if I'm understanding this correctly. Here is the code from my gatsby-node.js file:
const path = require(`path`);
const slash = require(`slash`);
const fetch = require('node-fetch');
const { createRemoteFileNode } = require(`gatsby-source-filesystem`)
exports.onCreateNode = ({ node, actions, store, cache,createNodeId, }) => {
const { createNode, createNodeField } = actions;
function getData(url) {
return new Promise((resolve, reject) => {
fetch(url)
.then((response) => response.json())
.then((data) => {
resolve(data);
});
})
}
if( node.internal.type === "wordpress__wp_location"){
const yextID = node.acf.yext_entity_id;
const yextOrthos = node.acf.location_orthodontists;
try {
const getLocation = async () => {
const data = await fetch("https://api.yext.com/v2/accounts/me/entities?api_key=" + process.env.YEXT_API_KEY + "&v=20191114&filter=%7B%22%24anywhere%22%3A%20%22" + yextID + "%22%7D&entityTypes=healthcareFacility")
.then(response => response.json());
// Transform the data into json
if( data && data.response && data.response.count === 1 ){
createNodeField({
node,
name: `yextLocation`,
value: data.response.entities[0]
});
} else {
console.log("NO LOCATIONS FOUND");
}
};
function getOrthos(){
let orthodontists = [];
yextOrthos.forEach( (ortho, i) => {
orthodontists.push(getData("https://api.yext.com/v2/accounts/me/entities?api_key=" + process.env.YEXT_API_KEY + "&v=20191114&filter=%7B%22%24anywhere%22%3A%20%22" + ortho.acf.yext_entity_ortho_id + "%22%7D&entityTypes=healthcareProfessional"));
});
Promise.all(orthodontists).then( (orthoData) => {
if( orthoData.length ){
let finalOrthos = [];
orthoData.forEach( (finalOrtho, x) => {
finalOrthos.push(finalOrtho.response.entities[0]);
});
createNodeField({
node,
name: `yextOrthos`,
value: finalOrthos
});
} else {
console.log("NO DOCTORS FOUND");
}
});
}
getLocation();
getOrthos();
} catch (error) {
console.log(error);
}
}
if( node.internal.type === "wordpress__wp_orthodontist"){
const yextID = node.acf.yext_entity_ortho_id;
const wpID = node.wordpress_id;
try {
const getTextOrtho = async () => {
const data = await fetch("https://api.yext.com/v2/accounts/me/entities?api_key=" + process.env.YEXT_API_KEY + "&v=20191114&filter=%7B%22%24anywhere%22%3A%20%22" + yextID + "%22%7D&entityTypes=healthcareProfessional")
.then(response => response.json());
// Transform the data into json
if( data && data.response && data.response.count === 1 ){
googleProfilePhoto = data.response.entities[0].googleProfilePhoto.url;
createNodeField({
node,
name: `yextOrthodontist`,
value: data.response.entities[0]
});
if( data.response.entities[0].googleProfilePhoto && data.response.entities[0].googleProfilePhoto.url){
createNodeField({
node,
name: `yextProfilePicture`,
value: data.response.entities[0].googleProfilePhoto.url
});
let fileNode = await createRemoteFileNode({
url: data.response.entities[0].googleProfilePhoto.url, // string that points to the URL of the image
parentNodeId: node.id, // id of the parent node of the fileNode you are going to create
createNode, // helper function in gatsby-node to generate the node
createNodeId, // helper function in gatsby-node to generate the node id
cache, // Gatsby's cache
store, // Gatsby's redux store
})
// if the file was created, attach the new node to the parent node
if (fileNode) {
console.log("GOOGLE PROFILE NODE CREATED!")
node.featuredImg___NODE = fileNode.id
} else {
console.log("ERROR! fileNode not Created!");
}
} else {
console.log("NO GOOGLE PROFILE PHOTO FOUND");
}
} else {
console.log("NO ORTHODONTISTS FOUND");
}
}
const getWpLocations = async () => {
const data = await fetch(process.env.GATSBY_WP_BASEURL+ "/wp-json/custom_endpoint/v1/locations_by_orthodontist?orthodontist_id=" + wpID).then(response => response.json());
if( data ){
createNodeField({
node,
name: `wpLocations`,
value: data
});
} else {
console.log("NO ORTHODONTISTS FOUND");
}
}
getTextOrtho();
getWpLocations();
} catch (error) {
console.log(error);
}
}
}
exports.createPages = async ({ graphql, actions }) => {
const { createPage } = actions;
const result = await graphql(`
{
locations: allWordpressWpLocation(filter: {status: {eq: "publish"}}) {
nodes {
id
path
acf {
location_orthodontists {
acf {
yext_entity_ortho_id
}
}
yext_entity_id
}
}
}
pages: allWordpressPage(
filter: {
wordpress_id: {nin: [177, 183, 8, 42, 44, 185, 46]}
status: {eq: "publish"}
}) {
nodes {
id
wordpress_id
path
}
}
orthodontists: allWordpressWpOrthodontist(filter: {status: {eq: "publish"}}) {
nodes {
id
path
}
}
posts: allWordpressPost(filter: {status: {eq: "publish"}}) {
nodes {
slug
id
}
}
}
`);
// Check for any errors
if (result.errors) {
throw new Error(result.errors);
}
const { locations, pages, orthodontists, posts } = result.data;
const locationTemplate = path.resolve(`./src/templates/location.js`);
const pageTemplate = path.resolve(`./src/templates/page.js`);
const orthoTemplate = path.resolve(`./src/templates/orthodontist.js`);
const postTemplate = path.resolve(`./src/templates/post.js`);
const blogTemplate = path.resolve(`./src/templates/blog.js`);
locations.nodes.forEach(node => {
let orthodontists = [];
node.acf.location_orthodontists.forEach(ortho => {
orthodontists.push(ortho.acf.yext_entity_ortho_id);
});
let orthodontistList = orthodontists.join();
createPage({
path: `${node.path}`,
component: slash(locationTemplate),
context: {
id: node.id,
yextId: node.acf.yext_entity_id,
yextOrthoIds: orthodontists
},
});
});
pages.nodes.forEach(node => {
createPage({
path: `${node.path}`,
component: slash(pageTemplate),
context: {
id: node.id,
},
});
});
orthodontists.nodes.forEach(node => {
createPage({
path: `${node.path}`,
component: slash(orthoTemplate),
context: {
id: node.id,
},
});
});
posts.nodes.forEach(node => {
createPage({
path: `${node.slug}`,
component: slash(postTemplate),
context: {
id: node.id,
},
});
});
const postsPerPage = 12;
const numPages = Math.ceil(posts.nodes.length / postsPerPage);
Array.from({ length: numPages }).forEach((_, i) => {
createPage({
path: i === 0 ? `/blog` : `/blog/page/${i + 1}`,
component: slash(blogTemplate),
context: {
limit: postsPerPage,
skip: i * postsPerPage,
numPages,
currentPage: i + 1,
},
})
})
};
Thanks for any information you can provide! I imagine this is probably due to me still learning to use asynchronous behavior in JS, but I just can't seem to find information on how to make this happen.
Please let me know if I can explain the situation any better!
After a rewrite, this seems to have solved the issue I was having. The key change is that onCreateNode is now an async function that awaits each of the fetch/createNodeField helpers, so Gatsby waits for that work to finish before createPages runs. I'll be honest, I'm still working on completely understanding the ins and outs of async/await/promise functionality in JS, but hopefully if someone encounters a similar problem, viewing this rewrite may help:
const path = require(`path`);
const slash = require(`slash`);
const fetch = require('node-fetch');
const { createRemoteFileNode } = require(`gatsby-source-filesystem`)
exports.onCreateNode = async ({ node, actions, store, cache,createNodeId, }) => {
const { createNode, createNodeField } = actions;
const getData = async (url) => {
return new Promise((resolve, reject) => {
fetch(url)
.then((response) => response.json())
.then((data) => {
resolve(data);
});
})
}
const getLocation = async (yextID) => {
const data = await getData("https://api.yext.com/v2/accounts/me/entities?api_key=" + process.env.YEXT_API_KEY + "&v=20191114&filter=%7B%22%24anywhere%22%3A%20%22" + yextID + "%22%7D&entityTypes=healthcareFacility");
// Transform the data into json
if( data && data.response && data.response.count === 1 ){
createNodeField({
node,
name: `yextLocation`,
value: data.response.entities[0]
});
} else {
console.log("NO LOCATIONS FOUND");
}
};
const getOrthos = async (yextOrthos) => {
let orthodontists = [];
yextOrthos.forEach( (ortho, i) => {
orthodontists.push(getData("https://api.yext.com/v2/accounts/me/entities?api_key=" + process.env.YEXT_API_KEY + "&v=20191114&filter=%7B%22%24anywhere%22%3A%20%22" + ortho.acf.yext_entity_ortho_id + "%22%7D&entityTypes=healthcareProfessional"));
});
// Return the promise so the caller's await actually waits for these fields.
return Promise.all(orthodontists).then((orthoData) => {
if( orthoData.length ){
let finalOrthos = [];
orthoData.forEach( (finalOrtho, x) => {
finalOrthos.push(finalOrtho.response.entities[0]);
});
createNodeField({
node,
name: `yextOrthos`,
value: finalOrthos
});
} else {
console.log("NO DOCTORS FOUND");
}
});
};
const getTextOrtho = async (yextID) => {
const data = await getData("https://api.yext.com/v2/accounts/me/entities?api_key=" + process.env.YEXT_API_KEY + "&v=20191114&filter=%7B%22%24anywhere%22%3A%20%22" + yextID + "%22%7D&entityTypes=healthcareProfessional");
if( data && data.response && data.response.count === 1 ){
createNodeField({
node,
name: `yextOrthodontist`,
value: data.response.entities[0]
});
if( data.response.entities[0].googleProfilePhoto && data.response.entities[0].googleProfilePhoto.url){
createNodeField({
node,
name: `yextProfilePicture`,
value: data.response.entities[0].googleProfilePhoto.url
});
let fileNode = await createRemoteFileNode({
url: data.response.entities[0].googleProfilePhoto.url, // string that points to the URL of the image
parentNodeId: node.id, // id of the parent node of the fileNode you are going to create
createNode, // helper function in gatsby-node to generate the node
createNodeId, // helper function in gatsby-node to generate the node id
cache, // Gatsby's cache
store, // Gatsby's redux store
});
// if the file was created, attach the new node to the parent node
if (fileNode) {
node.featuredImg___NODE = fileNode.id;
console.log("GOOGLE PROFILE NODE CREATED!")
} else {
console.log("ERROR! fileNode not Created!");
}
} else {
console.log("NO GOOGLE PROFILE PHOTO FOUND");
}
} else {
console.log("NO ORTHODONTISTS FOUND");
}
};
const getWpLocations = async (wpID) => {
const data = await getData(process.env.GATSBY_WP_BASEURL+ "/wp-json/perch_endpoint/v1/locations_by_orthodontist?orthodontist_id=" + wpID);
if( data ){
createNodeField({
node,
name: `wpLocations`,
value: data
});
} else {
console.log("NO ORTHODONTISTS FOUND");
}
}
if( node.internal.type === "wordpress__wp_location"){
const yextID = node.acf.yext_entity_id;
const yextOrthos = node.acf.location_orthodontists;
try {
await getLocation(yextID);
await getOrthos(yextOrthos);
} catch (error) {
console.log(error);
}
}
if( node.internal.type === "wordpress__wp_orthodontist"){
const yextID = node.acf.yext_entity_ortho_id;
const wpID = node.wordpress_id;
try {
await getTextOrtho(yextID);
await getWpLocations(wpID);
} catch (error) {
console.log(error);
}
}
}
exports.createPages = async ({ graphql, actions }) => {
const { createPage } = actions;
const result = await graphql(`
{
locations: allWordpressWpLocation(filter: {status: {eq: "publish"}}) {
nodes {
id
path
acf {
location_orthodontists {
acf {
yext_entity_ortho_id
}
}
yext_entity_id
}
}
}
pages: allWordpressPage(
filter: {
wordpress_id: {nin: [177, 183, 8, 42, 44, 185, 46]}
status: {eq: "publish"}
}) {
nodes {
id
wordpress_id
path
}
}
orthodontists: allWordpressWpOrthodontist(filter: {status: {eq: "publish"}}) {
nodes {
id
path
}
}
posts: allWordpressPost(filter: {status: {eq: "publish"}}) {
nodes {
slug
id
}
}
}
`);
// Check for any errors
if (result.errors) {
throw new Error(result.errors);
}
const { locations, pages, orthodontists, posts } = result.data;
const locationTemplate = path.resolve(`./src/templates/location.js`);
const pageTemplate = path.resolve(`./src/templates/page.js`);
const orthoTemplate = path.resolve(`./src/templates/orthodontist.js`);
const postTemplate = path.resolve(`./src/templates/post.js`);
const blogTemplate = path.resolve(`./src/templates/blog.js`);
locations.nodes.forEach(node => {
let orthodontists = [];
node.acf.location_orthodontists.forEach(ortho => {
orthodontists.push(ortho.acf.yext_entity_ortho_id);
});
let orthodontistList = orthodontists.join();
createPage({
path: `${node.path}`,
component: slash(locationTemplate),
context: {
id: node.id,
yextId: node.acf.yext_entity_id,
yextOrthoIds: orthodontists
},
});
});
pages.nodes.forEach(node => {
createPage({
path: `${node.path}`,
component: slash(pageTemplate),
context: {
id: node.id,
},
});
});
orthodontists.nodes.forEach(node => {
createPage({
path: `${node.path}`,
component: slash(orthoTemplate),
context: {
id: node.id,
},
});
});
posts.nodes.forEach(node => {
createPage({
path: `${node.slug}`,
component: slash(postTemplate),
context: {
id: node.id,
},
});
});
const postsPerPage = 12;
const numPages = Math.ceil(posts.nodes.length / postsPerPage);
Array.from({ length: numPages }).forEach((_, i) => {
createPage({
path: i === 0 ? `/blog` : `/blog/page/${i + 1}`,
component: slash(blogTemplate),
context: {
limit: postsPerPage,
skip: i * postsPerPage,
numPages,
currentPage: i + 1,
},
})
})
};
How can I handle image uploads in GraphQL?
Through multer, using an Express route to handle the upload, and a GraphQL query to view the images and other data:
app.use('/graphql', upload);
app.use('/graphql', getData, graphqlHTTP(tokenData => ({
schema,
pretty: true,
tokenData,
graphiql: true,
})));
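For completeness, a rough sketch of how the upload middleware above might be defined (the field name and destination directory are assumptions, and getData would be your own middleware that attaches request data for the resolvers):
const multer = require('multer');
// Assumption: images arrive in a multipart field named "file" and are stored on disk.
const upload = multer({ dest: 'uploads/' }).single('file');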
This is a duplicate of How would you do file uploads in a React-Relay app?
In short, yes, you can do a file upload in GraphQL with React + Relay.
You need to write the Relay store update action, for example:
onDrop: function(files) {
files.forEach((file)=> {
Relay.Store.commitUpdate(
new AddImageMutation({
file,
images: this.props.User,
}),
{onSuccess, onFailure}
);
});
},
Then implement a mutation for the Relay store:
class AddImageMutation extends Relay.Mutation {
static fragments = {
images: () => Relay.QL`
fragment on User {
id,
}`,
};
getMutation() {
return Relay.QL`mutation{ introduceImage }`;
}
getFiles() {
return {
file: this.props.file,
};
}
getVariables() {
return {
imageName: this.props.file.name,
};
}
getFatQuery() {
return Relay.QL`
fragment on IntroduceImagePayload {
User {
images(first: 30) {
edges {
node {
id,
}
}
}
},
newImageEdge,
}
`;
}
getConfigs() {
return [{
type: 'RANGE_ADD',
parentName: 'User',
parentID: this.props.images.id,
connectionName: 'images',
edgeName: 'newImageEdge',
rangeBehaviors: {
'': 'prepend',
},
}];
}
}
In your server-side schema, perform the update:
const imageMutation = Relay.mutationWithClientMutationId({
name: 'IntroduceImage',
inputFields: {
imageName: {
type: new GraphQL.GraphQLNonNull(GraphQL.GraphQLString),
},
},
outputFields: {
newImageEdge: {
type: ImageEdge,
resolve: (payload, args, options) => {
const file = options.rootValue.request.file;
//write the image to your disk
return uploadFile(file.buffer, filePath, filename)
.then(() => {
/* Find the offset for new edge*/
return Promise.all(
[(new myImages()).getAll(),
(new myImages()).getById(payload.insertId)])
.spread((allImages, newImage) => {
const newImageStr = JSON.stringify(newImage);
/* If edge is in list return index */
const offset = allImages.reduce((pre, ele, idx) => {
if (JSON.stringify(ele) === newImageStr) {
return idx;
}
return pre;
}, -1);
return {
cursor: offset !== -1 ? Relay.offsetToCursor(offset) : null,
node: newImage,
};
});
});
},
},
User: {
type: UserType,
resolve: () => (new myImages()).getAll(),
},
},
mutateAndGetPayload: (input) => {
//split the file name into base name and extension
let imageName = input.imageName.substring(0, input.imageName.lastIndexOf('.'));
const mimeType = input.imageName.substring(input.imageName.lastIndexOf('.'));
//write the image to the database
return (new myImages())
.add(imageName)
.then(id => {
//prepare to write to disk
return {
insertId: id,
imgName: imageName,
};
});
},
});
You can find all of the code above in this repo: https://github.com/bfwg/relay-gallery
There is also a live demo: http://fanjin.io