I'm having some trouble passing a variable that holds a JSON object into SendGrid's dynamic_template_data. My setup looks like this:
const send = async (address, mentions) => {
  console.log('mentions json obj', mentions)
  let name = "john"
  try {
    let config = {
      headers: {
        Authorization: `Bearer ${process.env.sendgridKey}`,
      }
    }
    let data = {
      personalizations: [
        {
          to: [
            {
              email: `${address}`,
            },
          ],
          dynamic_template_data: {
            name: name,
            allMentions: mentions
          }
        }
      ],
      from: {
        email: "email#email.com",
        name: "Mentionscrawler Team"
      },
      template_id: process.env.template_id,
    }
    await axios.post("https://api.sendgrid.com/v3/mail/send", data, config)
  } catch (error) {
    console.error(error, 'failing here>>>>>>>')
  }
}
When I console.log mentions (which is JSON) and paste the output from the terminal directly into the allMentions key, it works. But when I just pass in mentions itself, nothing shows up in the sent email. I've been confused for the last few hours about why this is happening. Any advice is appreciated.
Edit: I should also note that allMentions is an object whose keys hold arrays, and I'm looking to iterate over those arrays. Again, this all works if I paste in directly what mentions is, but passing in mentions itself is giving me an issue.
Thank you very much,
Just realized what was wrong: SendGrid's template requires a JSON object, so I assumed I needed to call JSON.stringify on my mentions object. It turns out I didn't need to do that, as long as all the values are in string format.
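For illustration, here is a rough sketch of what that means in practice. The toStringValues helper and the site/hits names are hypothetical, and it assumes the arrays inside mentions hold primitive values:

// Hypothetical helper: coerce every value in each array to a string before
// handing the object to dynamic_template_data (no JSON.stringify needed).
const toStringValues = (mentions) =>
  Object.fromEntries(
    Object.entries(mentions).map(([site, hits]) => [site, hits.map(String)])
  );

// ...then inside send():
// dynamic_template_data: { name: name, allMentions: toStringValues(mentions) }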
I am building a dictionary, but I would like some of the values to contain variables. Is there a way to pass a variable into the dictionary so I can assign a dot-notation value? The variables object will always have the same structure, and the dictionary will be static and structured the same for each key-value pair. Essentially, I want to pass the value from the dictionary to another function to handle the data.
main.js
import myDictionary from "myDictionary.js"
const variables = {
  item: "Hello"
}
const data = myDictionary[key](variables)
console.log(data)
myDictionary.js
const myDictionary = {
  key: variables.item
}
So the log should display "Hello". I know it will be something straightforward, but I can't seem to figure it out.
As always, any help is greatly appreciated.
You should modify the dictionary so that it holds actual callback functions instead. Only then will it be able to accept arguments.
const myDictionary = {
  key: (variables) => variables.item
}

const variables = {
  item: "Hello"
}

const key = "key";
const data = myDictionary[key](variables)
console.log(data)
What you are trying to do is not possible: the myDictionary.js file has no idea what's inside your main file. The only thing you could do would be:
myDictionary.js
const myDictionary = {
  key: "item"
}
main.js
import myDictionary from "myDictionary.js";
const variables = {
  item: "Hello"
};
const data = variables[myDictionary["key"]];
console.log(data);
Also, even though JavaScript does not require semicolons, using them will save you a lot of headaches caused by the rules of automatic semicolon insertion.
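For example (a hypothetical snippet, not taken from the code above), automatic semicolon insertion silently breaks a return statement whose value starts on the next line:

function getConfig() {
  return        // a semicolon is inserted here automatically...
  {
    mode: "dark"
  }
}

console.log(getConfig()) // ...so this logs undefined, not the object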
I must apologise: when I asked the question I wasn't fully clear on what I needed. But after some experimentation, a look at my edge cases, and a look at Krzysztof's answer, I had a thought and came up with something similar to this:
const dict = {
  key: (eventData) => {
    return [
      {
        module: 'company',
        entity: 'placement',
        variables: {
          placement_id: {
            value: eventData.id,
          },
        },
      },
      {
        module: 'company',
        entity: 'placement',
        variables: {
          client_id: {
            value: eventData.client.id,
          },
        },
      },
    ];
  },
}
Then I'm getting the data like this:
const data = dict?.[key](eventData)
console.log(data)
I can then navigate or manipulate the data however I need.
Thank you to everyone who spent time helping me.
I have a page that consists of two components, and each of them has its own request for data. For example:
<MovieInfo movieId={queryParamsId}/>
const GET_MOVIE_INFO = gql`
  query($id: String!){
    movie(id: $id){
      name
      description
    }
  }
`
Next component
<MovieActors movieId={queryParamsId}/>
const GET_MOVIE_ACTORS = gql`
  query($id: String!){
    movie(id: $id){
      actors
    }
  }
`
For each of these queries I use the Apollo useQuery hook:
const { data, loading, error } = useQuery(GET_DATA, { variables: { id: queryParamsId } })
Everything is fine, but I got a warning message:
Cache data may be lost when replacing the movie field of a Query object.
To address this problem (which is not a bug in Apollo Client), either ensure all objects of type Movie have IDs, or define a custom merge function for the Query.movie field, so InMemoryCache can safely merge these objects: { ... }
It works OK in Google Chrome, but this error affects Safari: everything crashes. I'm 100% sure it's because of this warning message. On the first request I set the Movie data in the cache; on the second request to the same query I just replace the old data with the new, so the previously cached data is undefined. How can I resolve this problem?
Here is the same solution mentioned by Thomas, but a bit shorter:
const cache = new InMemoryCache({
  typePolicies: {
    Query: {
      fields: {
        YOUR_FIELD: {
          // shorthand
          merge: true,
        },
      },
    },
  },
});
This is the same as the following:
const cache = new InMemoryCache({
  typePolicies: {
    Query: {
      fields: {
        YOUR_FIELD: {
          merge(existing, incoming, { mergeObjects }) {
            return mergeObjects(existing, incoming);
          },
        },
      },
    },
  },
});
Solved!
const cache = new InMemoryCache({
  typePolicies: {
    Query: {
      fields: {
        YOUR_FIELD: {
          merge(existing = [], incoming: any) {
            // What you return here depends on what you actually need to do;
            // in my case I had to save my incoming data as a single object in the cache.
            return { ...existing, ...incoming };
          }
        }
      }
    }
  }
});
The other answers still work, but as of Apollo Client >= 3.3 there's an easier option that doesn't require naming specific fields or writing a custom merge function. Instead, you only have to specify the type, and it will merge all fields for that type:
const cache = new InMemoryCache({
  typePolicies: {
    YOUR_TYPE_NAME: {
      merge: true,
    }
  }
});
From your example query, I'd guess that an id field should be available, though? Try requesting the id in your query; that should solve the problem in a much cleaner way.
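For illustration, this is the first query from the question with an id field added (assuming the Movie type actually exposes one):

const GET_MOVIE_INFO = gql`
  query($id: String!){
    movie(id: $id){
      id          # lets InMemoryCache normalize and merge Movie objects
      name
      description
    }
  }
`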
Had the same issue, caused by an inconsistency between our data values and our schema: a value type within an entity was missing its id value, the result of an incomplete data migration.
Temporary solution:
const typePolicies = {
  PROBLEM_TYPE: {
    keyFields: false as false,
  },
  PARENT_TYPE: {
    fields: {
      PROBLEM_FIELD: {
        merge: true
      }
    }
  }
}
I'm trying to post data with axios to my API, but although my JSON data looks correct when I pass it to axios, the browser tools tell me that my nested property is empty.
My data contains nested objects and looks like this:
{
  'name': 'foo',
  'index': 1,
  'nested': [
    {
      'date': '2020-05-10',
      'geojson_data': {
        'crs': Object,
        'name': 'my-name',
        'type': 'FeatureCollection',
        'features': [{ ... }, { ... }]
      }
    },
  ]
}
Edit: geojson_data results from parsing a .geojson file, so the features array contains ~300 items.
My axios function is defined here:
async post(data) {
  console.log(data);
  return axios
    .post(API_URL + 'endpoint/',
      data,
      {
        headers: authHeader()
      })
    .then(response => {
      return response.data;
    })
}
authHeader() is used to provide Bearer token authorization.
I checked that the data passed to axios is correct (it looks like the above), but the browser tools tell me that the data actually sent looks like this:
{"name":"foo","index":1,"nested":[]}
Why is my array empty?
Edit: I tried to manually populate a sample nested object and it seems to work. Yet, whenever I use my actual data, it doesn't.
Don't know if it's relevant but in the console, here is the difference I can see between the 2 collapsed objects (actual vs sample):
Sample : Object { name: "foo", index: "1", nested: (1) […] }
Actual : Object { name: "foo", index: "1", nested: [] }
Notice the nested array, which looks like it is empty. Yet if I expand it, they look the same. Any ideas?
P.S.: this looks like this SO post from 3 years ago, but without a solution.
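One thing worth checking (a hypothetical debugging step, not a confirmed cause): the collapsed object a browser console prints is only resolved when you expand it, so it can reflect later mutations. Logging a serialized snapshot inside post() shows what the payload contained at the moment of the call:

// Hypothetical check: capture the state of `data` at call time.
// If this prints an empty nested array, the data was already empty
// when post() ran; if not, something mutates it afterwards.
console.log(JSON.stringify(data, null, 2));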
I'm sending REST API requests using the axios package.
I can get a single document (for example, cities/cityId):
axios.get(`https://firestore.googleapis.com/v1/projects/<PROJECTIDHERE>/databases/(default)/documents/<COLLECTIONNAME>/<DOCID>`)
What I can't do is get a nested document (for example, cities/cityId/streetId).
The database structure is very simple.
cities: { // COLLECTION
  cityId: { // DOCUMENT
    streetId: { // MAP
      buildingId: '...', // STRING
      ...
    },
    ...
  },
}
This article, Google Firestore REST API examples, suggests that it's possible to get nested objects using structured queries. Unfortunately, I've been trying to do it without any success.
Here's my code, which doesn't work:
getQuery ({ path, token }) {
  const url = 'https://firestore.googleapis.com/v1/projects/big-boobs/databases/(default)/documents:runQuery'
  const params = {
    from: [ { collectionId: 'cities' } ],
    select: {
      fields: [
        { fieldPath: 'cityId1.streetId' }
      ]
    }
  }
  const options = {
    method: 'get',
    url,
    params,
    paramsSerializer: function (params) {
      return Qs.stringify(params, { arrayFormat: 'brackets' })
    }
  }
  if (token) options.headers['Authorization'] = `Bearer ${token}`
  return axios(options)
}
I'm getting an error:
{
  "error": {
    "code": 400,
    "message": "Invalid JSON payload received. Unknown name \"from[][collectionId]\": Cannot bind query parameter. Field 'from[][collectionId]' could not be found in request message.\nInvalid JSON payload received. Unknown name \"select[fields][][fieldPath]\": Cannot bind query parameter. Field 'select[fields][][fieldPath]' could not be found in request message."
  }
}
You can select individual fields from a document, but you can't select individual properties from a map field. I think you'll have to settle for getting the entire map called streetId.
If it's unacceptable to pull down the entire map, you can reorganize your data such that each streetId exists in its own document.
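For illustration, a minimal sketch of pulling down the whole document over REST and reading the streetId map from the typed fields structure the Firestore REST API returns. The helper name and the projectId/cityId/streetId parameters are placeholders, not code from the question:

const axios = require('axios')

// Hypothetical helper: fetch one city document and read its streetId map.
// The REST API returns fields as typed values, so a map field comes back
// under mapValue.fields rather than as a plain JavaScript object.
async function getStreet(projectId, cityId, streetId) {
  const url = `https://firestore.googleapis.com/v1/projects/${projectId}/databases/(default)/documents/cities/${cityId}`
  const { data } = await axios.get(url)
  return data.fields[streetId].mapValue.fields
  // e.g. { buildingId: { stringValue: '...' }, ... }
}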
I have a basic GraphQL query setup as follows:
Query.js:
const Query = {
  dogs(parent, args, ctx, info) {
    return [{ name: 'Snickers' }, { name: 'Sunny' }];
  },
};

module.exports = Query;
schema.graphql:
type Dog {
  name: String!
}

type Query {
  dogs: [Dog]!
}
I created a function createServer() for starting the server as follows:
const { GraphQLServer } = require('graphql-yoga');
const Mutation = require('./resolvers/Mutation');
const Query = require('./resolvers/Query');
const db = require('./db');

function createServer() {
  return new GraphQLServer({
    typeDefs: 'src/schema.graphql',
    resolvers: {
      Mutation,
      Query,
    },
    resolverValidationOptions: {
      requireResolversForResolveType: false,
    },
    context: req => ({ ...req, db }),
  });
}

module.exports = createServer;
I then tried querying dogs as follows:
query {
  dogs {
    name
  }
}
But instead of getting the names from the array of dogs, I got the following error instead:
{
  "data": null,
  "errors": [
    {
      "message": "Cannot return null for non-nullable field Query.dogs.",
      "locations": [
        {
          "line": 2,
          "column": 3
        }
      ],
      "path": [
        "dogs"
      ]
    }
  ]
}
What seems to be causing this error?
This problem comes from AWS requiring certain standard values in the DynamoDB table, such as createdAt and updatedAt; just add these fields manually with a timestamp in DynamoDB for further testing. A mutation always needs to be requested via id; this somehow was not clear to me when my schema was created by amplify codegen...
The above code works as you can see in codesandbox: https://codesandbox.io/s/olzj9vvpk5
But when I convert Query to something like {} it returns the same error, so please check your paths and console.log Query to validate them. Your export looks correct, but you might have forgotten to save the file: as I can see from the course starter files, Query starts out as an empty {}. Please double-check.
Also, if this code is in a public git repo, please share the link.
I know this question has been answered, but for me the only thing that fixed this issue was to also pass the info argument.
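For illustration, a minimal sketch of what passing info through looks like in a resolver, assuming a prisma-binding style db on the context (as in the createServer() setup above). The ctx.db.query.dogs call is an assumption about that setup, not code from the question:

const Query = {
  dogs(parent, args, ctx, info) {
    // Forwarding `info` tells the binding which fields the client selected,
    // so the non-nullable list can be resolved instead of coming back null.
    return ctx.db.query.dogs({}, info);
  },
};

module.exports = Query;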
In my case, I created a new Query.js file in the src folder, but I was still importing Query with Query = require('./resolvers/Query') while coding in the new file. So try to check the path; I think the problem is there.