I'm trying to upload a very large object that contains several nested arrays. I tried uploading it with FormData, but the server receives it as a string, like so:
Upload code:
formData.append('large_json', largObj);
As received on server:
{ "bob" : "[object Object],[object Object]" }
The string may be 15 million characters long. What are my options to get this to the server?
If it helps, I'm running an express server.
I think you are sending the object's string representation to the server, not the data inside it.
Have you already tried converting your object to JSON?
formData.append('large_json', JSON.stringify(largObj));
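A minimal sketch of why this works (field names taken from the question): appending a raw object to FormData coerces it to a string with "[object Object]" placeholders, while JSON.stringify preserves the nested structure so the express side can run it through JSON.parse.

```javascript
// What the asker's object might look like (contents are made up):
const largObj = { bob: [{ a: 1 }, { b: 2 }] };

// What the server receives when the raw object is appended:
const coerced = String(largObj.bob);
console.log(coerced); // "[object Object],[object Object]"

// What it receives after stringifying, ready for JSON.parse server-side:
const payload = JSON.stringify(largObj);
const restored = JSON.parse(payload);
console.log(restored.bob.length); // 2
```

On the express side the field arrives as plain text in the request body, so the handler calls JSON.parse on it before using the data.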
I have a lot of data loaded in my database, and some of the loaded documents are not JSON files but just binary files. Correct data looks like this: "/foo/bar/1.json", but the incorrect data is in the format "/foo/bar/*". Is there a mechanism in MarkLogic, using JavaScript, where I can filter out this junk data and delete it?
PS: I'm unable to extract files with mlcp that have a "?" in the URI, and that may be why I get this error when I try to reload the data. Is there any way to fix that extract along with this?
If all of the document URIs contain a ? and are in that directory, then you could use cts.uriMatch()
declareUpdate();

for (const uri of cts.uriMatch('/foo/bar/*?*')) {
  xdmp.documentDelete(uri);
}
Alternatively, if you are looking to find the binary() documents, you can apply the format-binary option to a cts.search() with a cts.directoryQuery() and then delete them.
declareUpdate();

for (const doc of cts.search(cts.directoryQuery("/foo/bar/"), ['format-binary'])) {
  xdmp.documentDelete(fn.baseUri(doc));
}
They are probably being persisted as binary because there is no recognizable file extension when the URI ends with a question mark and query-string parameter values, i.e. 1.json?foo=bar instead of 1.json.
It is difficult to diagnose and troubleshoot without seeing what your MLCP job configs are and knowing more about what you are doing to load the data.
How can I send JSON.stringify(array) data within form-data and decode the JSON with my Django API?
I'm trying to add the ability to upload an array of date-strings within a form.
Originally we sent the POST data as JSON, and sending arrays of data worked; however, when we switched to form-data to make uploading an image easier, we started having problems with array types.
Since form-data fields have to be sent as strings, I converted the date-string array using JSON.stringify():
const myForm = new FormData();
myForm.set("date_strings", JSON.stringify(dateStrings));
when I post myForm to my Django + DRF API it responds with
{
"date_strings": [
"Datetime has wrong format. Use one of these formats instead: YYYY-MM-DDThh:mm[:ss[.uuuuuu]][+HH:MM|-HH:MM|Z]."
],
"status_code": 400
}
In Postman I verified that sending a single date-string works, but when I send a string-array I get the same error.
I believe my Django API checks whether request.data is valid, sees that date_strings is a JSON string rather than a list, then responds with a 400 error.
def create(self, request, *args, **kwargs):
    serializer = self.get_serializer(data=request.data)
    serializer.is_valid(raise_exception=True)
Attempted Solutions:
converting the JSON string to an array in the PostViewset create method:
I can't change request.data['publish_dates'] because it is not mutable, and I've seen advice that you should not attempt to copy or change request data in the viewset because it isn't safe.
converting the JSON string to an array in the serializer:
neither MySerializer's create nor validate method runs (I added logging to test).
date_strings are formatted and created as separate PublishDate models in the create method.
class MySerializer(MyChildSerializer):
    date_strings = serializers.ListField(child=serializers.DateTimeField(), min_length=1, max_length=100, write_only=True)
How can I send/accept form-data to my Django + DRF API when one attribute is an array of date-time strings?
I realized the problem and found a solution. Because MySerializer declared date_strings as a ListField, the JSON string coming in from form-data failed validation and was rejected as a bad request.
By changing the date_strings field to a CharField, MySerializer now runs its validate and create methods. I still have to find a way to convert the JSON date_strings back into an array in the serializer's create method, but I solved the API's 400 issue.
MySerializer now looks like:
class CreatePostSerializer(SocialVeilModelSerializer):
    publish_dates = serializers.CharField(max_length=2000)
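The remaining step can be sketched with the standard library only (the DRF serializer hooks are omitted, and the sample strings are made up): parse the JSON string back into a list, then into real datetime objects, the shape the original ListField(child=DateTimeField()) expected.

```python
import json
from datetime import datetime

# The form field arrives as one JSON-encoded string (from JSON.stringify):
raw = '["2021-01-01T09:00:00", "2021-01-02T15:30:00"]'

# Inside the serializer's validate or create method you could decode it:
date_strings = json.loads(raw)
publish_dates = [datetime.fromisoformat(s) for s in date_strings]
print(publish_dates[0].hour)  # 9
```

In a real serializer this would live in a validate_publish_dates hook so that malformed JSON still raises a validation error instead of a server error.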
I have a step form in a project that handles a lot of data. To prevent errors during creation, all information is stored client-side and sent to the server at the end.
The information sent to the server looks like this:
{
  name: "project1",
  description: "lot of text",
  schedule: [{weekDay: 1, startHour: "09:00", endHour: "15:00"}, ...],
  tasks: ["task1", "task2", ... up to 20/30],
  files: [{file1}, {file2}, ...],
  services: [{
    name: "service1",
    description: "lot of text",
    schedule: [{weekDay: 1, startHour: "09:00", endHour: "15:00"}, ...],
    tasks: ["task1", "task2", ... up to 20/30],
    files: [{file1}, {file2}, ...],
    jobs: [{
      name: "job1",
      description: "lot of text",
      schedule: [{weekDay: 1, startHour: "09:00", endHour: "15:00"}, ...],
      tasks: ["task1", "task2", ... up to 20/30],
      files: [{file1}, {file2}, ...]
    }, {
      name: "job2",
      ...
    }]
  }, {
    name: "service2",
    ...
  }]
}
And so on..
This is a really reduced example; in a real environment there will be 1 project with about 10-15 services, each one with 4-5 jobs.
I have been able to process everything with about 15 items in the last level, and now I'm trying to preprocess the data to delete objects the server doesn't need before sending. With that I expect to be able to send over 50 items in the last level without triggering "max_input_variables exceeded xxx" server side. But it will still be very close to the limit in some cases.
I'm thinking about changing the way I send/receive data, but I'm not sure if my guesses are even correct.
Before someone suggests a JSON request to prevent the input variables error: the request has to be multipart/form-data to send files.
That said, my guesses were the following:
Mount all the data as JSON in a single variable and keep the files in separate variables (formData would look like {project: {hugeJSON}, files: [file1, file2], services: [{files: [...]}, {files: [...]}]}).
Send partial data to the server while the form is being filled and store it somewhere (a tmp file would be my best bet), then in the last step send only the main form information.
Probably a stupid guess, but is there something like sending chunked data? Ideally, I would like to show the user a loading bar saying "Creating project --> Saving Service nº1 --> Generating Docs for Service 1...". I think I could achieve this by making my server-side script generate a chunked response, but I'm not sure about that.
Well, any help that could show me the correct way would be really appreciated.
Thank you in advance.
Once you are finished filling your object, you should stringify it and send it to the server as a POST parameter.
Once you receive it server-side, you can parse the JSON and continue working.
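A runnable sketch of that advice for the multipart case (field names are illustrative): keep the whole project tree in one stringified field, so it counts as a single input variable server-side, and attach the files as ordinary multipart parts. FormData and Blob are globals in browsers and in Node 18+.

```javascript
// A reduced version of the project object from the question:
const project = {
  name: 'project1',
  services: [{ name: 'service1', jobs: [{ name: 'job1' }, { name: 'job2' }] }],
};

const form = new FormData();
// One stringified field = one input variable on the server:
form.set('project', JSON.stringify(project));
// Files stay as separate multipart parts:
form.append('files', new Blob(['file one contents']), 'file1.txt');
form.append('files', new Blob(['file two contents']), 'file2.txt');

// Server-side, decoding the single field restores the full tree:
const restored = JSON.parse(form.get('project'));
console.log(restored.services[0].jobs[1].name); // "job2"
```

However deep the nesting gets, the server only ever sees one variable for the JSON plus one per file, which sidesteps the max_input_variables limit.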
I have an HTTP server in Node (not express). On a button click I issue a GET request, which pulls documents from MongoDB (using mongoose) and displays them on an Angular page.
On button click:
$http.get('/get').success(function(response){
  console.log(response);
  // logic to store the JSON response from the database and repeat over each returned document to display it in the UI
});
In the Node code, where the server is created using http.createServer instead of express:
if (req.url === "/get") {
  res.writeHead(200, {'content-type': 'text/plain'});
  modelName.find({}, 'property1 prop2 prop3', function(err, docs) {
    res.write('response...: ' + docs);
  });
}
Here is my issue:
I'm able to send the response from Node to Angular, but how do I parse it? If I don't add 'response...: ' before docs, I get the error 'first argument must be a string or Buffer'. On the Angular side I get a response like:
response...:{_id:....1, prop1: 'a',prop2: 'b',prop3: 'c'},
{_id:....2, prop1: 'ab',prop2: 'bc',prop3: 'cd'}
I want to display the documents in a tabular format.
I don't know your exact setup, but I think you should transfer application/json instead of text/plain.
You cannot simply concatenate a string onto docs; you need to either send just the serialized docs (to transfer an array) or write res.write(JSON.stringify({response: docs})) (to transfer an object).
Consider moving from $http to a resource service. In your resource service, you need to set isArray to false if you want to transfer as an object or to true if you transfer as an array: https://docs.angularjs.org/api/ngResource/service/$resource
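The core of the fix can be seen without a server (docs is mocked here to stand in for what mongoose's find() returns): string concatenation collapses every document to "[object Object]", while JSON.stringify produces a body the Angular side can parse and lay out in a table.

```javascript
// Mocked result set, mirroring the shape in the question:
const docs = [
  { _id: 1, prop1: 'a',  prop2: 'b',  prop3: 'c'  },
  { _id: 2, prop1: 'ab', prop2: 'bc', prop3: 'cd' },
];

// What the text/plain handler was effectively sending:
const broken = 'response...: ' + docs; // documents collapse to "[object Object]"

// What to pass to res.write()/res.end() instead, together with
// a 'content-type': 'application/json' header:
const body = JSON.stringify(docs);

// What the Angular client can then do with it:
const parsed = JSON.parse(body);
console.log(parsed[1].prop1); // "ab"
```

With $http and an application/json content type, Angular performs the JSON.parse step automatically, so response arrives as an array ready for ng-repeat.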
I tried unsuccessfully to separate JSON data from ArrayBuffer data received from a websocket, which looks like this:
{"type":"string","data":{"UklGRkIjAABXRUJQVlA4IDYjAACQswCdASqrAfAAPm0wlUemI"}}
[object ArrayBuffer]
The objective is to read each of them with the proper function. My old method was to try to parse the JSON first and, if that failed, pass the data through another function.
The reason I send them in both formats is that converting the JSON data to an array, or the array to JSON, would increase the file size around threefold.
The best practice is to send them separately.
However, at the receiving end I read the JSON data with
var json = JSON.parse(e.data);
and read the ArrayBuffer with DataView method.
The app works properly on the surface, but if you check the console.log output there are too many uncaught errors.
It also blocks the data flow at some points, so the stream doesn't run smoothly.
Thanks for any suggestions in advance.
Got it:
if (typeof data === 'object') {
  // data is an ArrayBuffer
} else {
  // data is a JSON string
}
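That dispatch can be made a little more explicit (the function name is mine): with socket.binaryType = 'arraybuffer', binary frames arrive as ArrayBuffer and text frames as strings, so an instanceof check routes each message without a try/catch around JSON.parse.

```javascript
// Route a WebSocket message to the right reader based on its type:
function dispatch(data) {
  if (data instanceof ArrayBuffer) {
    // Binary frame: hand it to the DataView-based reader
    return { kind: 'binary', bytes: new DataView(data).byteLength };
  }
  // Text frame: expected to be JSON
  return { kind: 'json', value: JSON.parse(data) };
}

console.log(dispatch(new ArrayBuffer(8)).kind);        // "binary"
console.log(dispatch('{"type":"string"}').value.type); // "string"
```

Since only text frames ever reach JSON.parse, the uncaught parse errors from feeding it binary data disappear.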